Title: Privacy preserving distributed training of neural networks
Authors: Nikolaidis, Spyridon
Refanidis, Ioannis
Type: Article
Subjects: FRASCATI::Engineering and technology::Electrical engineering, Electronic engineering, Information engineering
Keywords: Decentralized neural network training
Data privacy
Weight averaging
Distributed ledger technology
Issue Date: 11-May-2020
Publisher: Springer
Source: Neural Computing and Applications
Abstract: LEARNAE is a system aiming to achieve fully distributed training of neural networks. It follows a ‘‘Vires in Numeris’’ approach, combining the resources of commodity personal computers. It operates on a full peer-to-peer model: all participating nodes share exactly the same privileges and obligations. Another significant feature of LEARNAE is its high degree of fault tolerance. All training data and metadata are propagated through the network using resilient gossip protocols, a robust approach that is essential in environments with unreliable connections and a frequently changing set of nodes. LEARNAE is based on a versatile working scheme and supports different roles, depending on the processing power and training data availability of each peer. This allows a broad application scope, ranging from powerful workstations to online sensors. To maintain a decentralized architecture, all underlying technology must be fully distributed as well. LEARNAE’s coordinating algorithm is platform agnostic, but for the purposes of this research two novel projects have been used: (1) IPFS, a decentralized filesystem, as a means to distribute data in a permissionless environment, and (2) IOTA, a decentralized network targeting low-energy ‘‘Internet of Things’’ devices. Our previous work made a first attempt at assessing the feasibility of using distributed ledger technology to collaboratively train a neural network. Here, that research is extended by applying LEARNAE to a fully deployed computer network and drawing the first experimental results. This article focuses on use cases that require data privacy; thus, only model weights are exchanged, never training data.
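The abstract's core privacy mechanism is that peers exchange model weights, not training data, and merge them by weight averaging. The sketch below is a minimal, hypothetical illustration of that idea, not LEARNAE's actual implementation; the function name and the peer weight values are invented for the example.

```python
# Hypothetical sketch of privacy-preserving weight averaging:
# each peer trains locally on private data and shares only its
# weight vector; a receiving peer merges by element-wise mean.

def average_weights(weight_sets):
    """Element-wise average of several peers' weight vectors."""
    n = len(weight_sets)
    return [sum(ws[i] for ws in weight_sets) / n
            for i in range(len(weight_sets[0]))]

# Three hypothetical peers, each holding locally trained weights;
# the raw training data never leaves any peer.
peer_a = [0.2, 0.4, 0.6]
peer_b = [0.4, 0.6, 0.8]
peer_c = [0.6, 0.8, 1.0]

merged = average_weights([peer_a, peer_b, peer_c])
```

In a gossip-based network such as the one the abstract describes, each node would repeat this merge whenever a neighbor's weights arrive, so all peers gradually converge toward a shared model without ever revealing their data.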
ISSN: 0941-0643
Electronic ISSN: 1433-3058
Other Identifiers: 10.1007/s00521-020-04880-0
Appears in Collections: Department of Applied Informatics

Files in This Item:
File: Learnae_Article_POSTPRINT.pdf
Description: postprint
Size: 1.27 MB
Format: Adobe PDF

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.