Please use this identifier to cite or link to this item: https://ruomo.lib.uom.gr/handle/7000/721
Full metadata record
DC Field | Value | Language
dc.contributor.author | Nikolaidis, Spyridon | -
dc.contributor.author | Refanidis, Ioannis | -
dc.date.accessioned | 2020-05-21T16:13:25Z | -
dc.date.available | 2020-05-21T16:13:25Z | -
dc.date.issued | 2020-05-11 | -
dc.identifier | 10.1007/s00521-020-04880-0 | en_US
dc.identifier.issn | 0941-0643 | en_US
dc.identifier.uri | https://doi.org/10.1007/s00521-020-04880-0 | en_US
dc.identifier.uri | https://ruomo.lib.uom.gr/handle/7000/721 | -
dc.description.abstract | LEARNAE is a system aiming to achieve a fully distributed way of training neural networks. It follows a “Vires in Numeris” approach, combining the resources of commodity personal computers. It has a fully peer-to-peer model of operation; all participating nodes share exactly the same privileges and obligations. Another significant feature of LEARNAE is its high degree of fault tolerance: all training data and metadata are propagated through the network using resilient gossip protocols. This robust approach is essential in environments with unreliable connections and a frequently changing set of nodes. LEARNAE is based on a versatile working scheme and supports different roles, depending on the processing power and training-data availability of each peer. In this way, it allows an expanded application scope, ranging from powerful workstations to online sensors. To maintain a decentralized architecture, all underlying technologies must be fully distributed as well. LEARNAE’s coordinating algorithm is platform-agnostic, but for the purposes of this research two novel projects have been used: (1) IPFS, a decentralized filesystem, as a means of distributing data in a permissionless environment, and (2) IOTA, a decentralized network targeting the world of low-energy “Internet of Things” devices. In our previous work, we made a first attempt at assessing the feasibility of using distributed ledger technology to collaboratively train a neural network. Here, we extend that research by applying LEARNAE to a fully deployed computer network and reporting the first experimental results. This article focuses on use cases that require data privacy; thus, only model weights are exchanged, never training data. | en_US
dc.language.iso | en | en_US
dc.publisher | Springer | en_US
dc.source | Neural Computing and Applications | en_US
dc.subject | FRASCATI::Engineering and technology::Electrical engineering, Electronic engineering, Information engineering | en_US
dc.subject.other | Decentralized neural network training | en_US
dc.subject.other | Data privacy | en_US
dc.subject.other | Weight averaging | en_US
dc.subject.other | Distributed ledger technology | en_US
dc.subject.other | IPFS | en_US
dc.subject.other | IOTA | en_US
dc.title | Privacy preserving distributed training of neural networks | en_US
dc.type | Article | en_US
dc.contributor.department | Department of Applied Informatics | en_US
local.identifier.volumetitle | S.I. : EMERGING APPLICATIONS OF DEEP LEARNING AND SPIKING ANN | en_US
local.identifier.eissn | 1433-3058 | en_US
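
The abstract above describes peers that keep their training data private and exchange only model weights, merging them by weight averaging over a peer-to-peer gossip network. The sketch below is a toy illustration of that general idea, not LEARNAE's actual protocol: it assumes a simple least-squares model in Python/NumPy, hypothetical names (Peer, local_step, merge, gossip_round), and plain pairwise averaging in place of the article's coordination over IPFS and IOTA.

```python
# Minimal sketch of gossip-based weight averaging (illustrative only, not the
# LEARNAE implementation). Each peer trains on its own private data shard and
# shares nothing but its weight vector with randomly chosen neighbours.
import numpy as np

rng = np.random.default_rng(0)

class Peer:
    def __init__(self, features, labels):
        self.X, self.y = features, labels      # private data shard, never shared
        self.w = np.zeros(features.shape[1])   # model weights, the only thing shared

    def local_step(self, lr=0.1):
        # One gradient step of least-squares regression on local data only.
        grad = self.X.T @ (self.X @ self.w - self.y) / len(self.y)
        self.w -= lr * grad

    def merge(self, received_w):
        # Weight averaging: blend own weights with a neighbour's copy.
        self.w = (self.w + received_w) / 2.0

def gossip_round(peers):
    # Every peer trains locally, then pushes its weights to one random neighbour.
    for p in peers:
        p.local_step()
    for p in peers:
        neighbour = peers[rng.integers(len(peers))]
        if neighbour is not p:
            neighbour.merge(p.w.copy())        # only weights cross the wire

# Toy setup: three peers, each holding a private shard drawn from the same problem.
true_w = np.array([2.0, -1.0])
peers = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + 0.01 * rng.normal(size=50)
    peers.append(Peer(X, y))

for _ in range(200):
    gossip_round(peers)

print([np.round(p.w, 2) for p in peers])       # every peer ends up close to true_w
```

In each round only the weight vectors cross between peers; the private (X, y) shards never leave their owners, which mirrors the data-privacy constraint the article targets.
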
Appears in Collections: Department of Applied Informatics

Files in This Item:
File | Description | Size | Format
Learnae_Article_POSTPRINT.pdf | postprint | 1.27 MB | Adobe PDF


Items in the repository are protected by copyright, unless otherwise indicated.