Abstract

In this paper we introduce FLoBC, a scalable, privacy-preserving federated learning framework built on the concept of distributed ledgers underlying blockchains. The work is motivated by the rapid worldwide growth of data, especially decentralized data, which calls for scalable, decentralized machine learning models capable of preserving the privacy of the participating users' data. Towards this objective, we first motivate and define the problem scope. We then introduce the proposed FLoBC system architecture, which hinges on a number of key pillars, namely parallelism, decentralization, and node update synchronization. In particular, we examine a number of known node update synchronization policies and evaluate their performance merits and design trade-offs. Finally, we compare the proposed federated learning system to a centralized learning baseline to demonstrate its performance merits. Our main finding is that the proposed decentralized learning framework achieves performance comparable to a classic centralized learning system, while distributing the model training process across multiple nodes without sharing their actual data. This provides a scalable, privacy-preserving solution for training a variety of large machine learning models.
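As a concrete illustration of the training pattern the abstract describes, the sketch below shows a generic FedAvg-style synchronous round in Python: each node trains a model on its own private data shard, and only the resulting weight vectors, never the raw samples, are shared and averaged. The aggregation rule, the linear model, and all names here are illustrative assumptions; the abstract does not specify FLoBC's actual update policy or model class.

```python
# Illustrative sketch only: assumes a generic FedAvg-style synchronous round,
# not FLoBC's actual (unspecified) aggregation or synchronization policy.
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One node's local training: gradient descent on a linear least-squares model."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)  # mean-squared-error gradient
        w -= lr * grad
    return w

def federated_round(global_w, node_data):
    """Synchronous round: every node trains locally, then the updates are
    averaged. Only weight vectors leave each node, so raw data stays private."""
    local_ws = [local_update(global_w, X, y) for X, y in node_data]
    return np.mean(local_ws, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    # Each node holds a private local shard of the data.
    node_data = []
    for _ in range(4):
        X = rng.normal(size=(50, 2))
        y = X @ true_w + 0.1 * rng.normal(size=50)
        node_data.append((X, y))
    w = np.zeros(2)
    for _ in range(20):
        w = federated_round(w, node_data)
    print("recovered weights:", w)  # approaches true_w without pooling any data
```

With a synchronous policy like this, every node must report before the global model advances; the asynchronous and bounded-staleness alternatives the paper evaluates trade that consistency for throughput when nodes are slow or unreliable.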
