Abstract

In this work, we present a blockchain-based federated learning (FL) framework that aims to achieve high system efficiency while simultaneously addressing data sparsity and the disclosure of private information. Across multiple networks, it is more efficient to build several smaller clusters than one large cluster. Blockchain-based FL is carried out within each cluster, and the model updates are aggregated at the end of the process. The aggregated updates are then exchanged across the clusters, which, in practice, enriches the updates available to each cluster. Compared with the extensive interactions required by flat blockchain-based FL, cluster-based FL sends only a small number of aggregated updates over long distances. To analyze our system, we implemented prototypes of both the cluster-based and blockchain-based FL models. The experimental results show that the accuracy of the cluster-based FL model ranges from 11% up to 72.6%, and the loss ranges from 0.8 up to 3.6. In addition, the cluster-based FL model can accelerate model convergence when the same quantity of data is fed into it, because a single training cycle combines the computational resources of many different clusters.
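
The core aggregation idea described above can be illustrated with a minimal sketch: client updates are averaged within each cluster, and only the per-cluster aggregates are exchanged and combined across clusters. The function names (fed_avg, cross_cluster_exchange) are illustrative assumptions, not the authors' API, and the blockchain layer that records each update is omitted here.

```python
from typing import List
import numpy as np

def fed_avg(updates: List[np.ndarray]) -> np.ndarray:
    """Average the model updates produced by clients inside one cluster."""
    return np.mean(np.stack(updates), axis=0)

def cross_cluster_exchange(cluster_updates: List[np.ndarray]) -> np.ndarray:
    """Average the per-cluster aggregates exchanged between clusters.

    Only one aggregate per cluster crosses the long-distance links,
    rather than every individual client update as in flat blockchain-based FL.
    """
    return np.mean(np.stack(cluster_updates), axis=0)

# Hypothetical round: 3 clusters of 4 clients each, model of 10 parameters.
rng = np.random.default_rng(0)
clusters = [[rng.normal(size=10) for _ in range(4)] for _ in range(3)]

per_cluster = [fed_avg(c) for c in clusters]          # aggregated inside each cluster
global_update = cross_cluster_exchange(per_cluster)   # exchanged across clusters
print(global_update.shape)  # (10,)
```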
