Abstract

Federated Learning (FL) is a machine learning technique in which collaborative, distributed training is performed while the private data remain local to each client. Rather than the data, only gradients are shared among the collaborating nodes through a central server. To preserve data privacy, the gradients are typically distorted, or their representation is perturbed, before sharing, which ultimately reduces model performance. Recent studies show that the original data can still be recovered from the latent space (the gradient leakage problem) using Generative Adversarial Networks and optimization algorithms such as Bayesian optimization and the Covariance Matrix Adaptation Evolution Strategy. To address data privacy and gradient leakage, in this paper we train deep neural networks using the blockchain-based Swarm Learning (SL) framework. In the SL scheme, instead of sending perturbed or noisy gradients to a central server, gradients are shared only among training nodes authenticated through a blockchain-based smart contract. To demonstrate the effectiveness of the SL approach, we evaluate it on the standard CIFAR-10 and MNIST benchmark datasets and compare it with existing methods.
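As a rough illustration of the contrast drawn above between server-mediated aggregation and gradient exchange among authenticated peers, the Python sketch below simulates one swarm round. The whitelist, node names, and toy least-squares objective are illustrative assumptions only; they stand in for the blockchain-based smart contract and do not represent the paper's actual implementation or any Swarm Learning API.

```python
import numpy as np

# Stand-in for a smart-contract whitelist of admitted training nodes (assumed names).
AUTHORIZED = {"node-0", "node-1", "node-2"}

class SwarmNode:
    def __init__(self, node_id, dim, seed):
        self.node_id = node_id
        rng = np.random.default_rng(seed)
        self.weights = rng.normal(size=dim)            # local model parameters
        self.local_data = rng.normal(size=(32, dim))   # private data never leaves the node

    def local_gradient(self):
        # Toy least-squares objective on the private data; only the
        # resulting gradient is shared, never the data itself.
        preds = self.local_data @ self.weights
        return self.local_data.T @ preds / len(self.local_data)

def authenticate(node):
    # Stand-in for the smart-contract check that admits a node to the swarm.
    return node.node_id in AUTHORIZED

def swarm_round(nodes, lr=0.01):
    peers = [n for n in nodes if authenticate(n)]
    # Unperturbed gradients are exchanged only among authenticated peers;
    # there is no central aggregation server in this scheme.
    avg_grad = np.mean([n.local_gradient() for n in peers], axis=0)
    for n in peers:
        n.weights -= lr * avg_grad

nodes = [SwarmNode(f"node-{i}", dim=8, seed=i) for i in range(3)]
nodes.append(SwarmNode("intruder", dim=8, seed=99))  # rejected by the whitelist
for _ in range(5):
    swarm_round(nodes)
```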
