Abstract

There has been significant recent interest in collaboratively training a machine learning (ML) model without collecting data at a central server. Federated learning (FL) has emerged as an efficient solution that mitigates systemic privacy risks and communication costs. However, conventional FL, inherited from the parameter-server design, depends heavily on a central server, which can lead to privacy risks, communication bottlenecks, or a single point of failure. In this paper, we propose FedDual, an asynchronous and hierarchical algorithm for local gradient aggregation and global model updates, under three different security considerations for FL in large decentralized networks. Specifically, FedDual preserves privacy by introducing local differential privacy (LDP) and aggregates local gradients asynchronously and hierarchically via a pair-wise gossip algorithm, making it more competitive than previous gossip-based decentralized FL methods in privacy preservation and communication efficiency, and more computationally efficient than existing blockchain-assisted decentralized FL methods. Further, we devise a noise cutting trick based on Private Set Intersection (PSI) to mitigate the loss in global model prediction performance caused by the introduced LDP noise. Rigorous theoretical analysis shows that FedDual enables decentralized FL to achieve the same convergence rate of <inline-formula xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink"> <tex-math notation="LaTeX">$\mathcal {O}\left({\frac {1}{T}}\right) $ </tex-math></inline-formula> as centralized ML. Extensive experiments on MNIST, CIFAR-10, and FEMNIST confirm that the prediction performance of models trained with FedDual is close to that of centralized ML. More importantly, the proposed noise cutting trick helps FedDual train better global models than LDP-based FL methods in terms of both prediction performance and convergence rate.
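To make the two core ingredients mentioned above concrete, the sketch below illustrates (1) LDP-style perturbation of a local gradient via clipping plus Gaussian noise, and (2) one round of pair-wise gossip averaging between matched peers. This is a minimal illustration under assumed mechanisms (the Gaussian mechanism, random pairing), not the paper's actual FedDual algorithm; function names, the clipping bound, and the noise scale are all hypothetical choices for exposition.

```python
import numpy as np

def ldp_perturb(grad, clip=1.0, sigma=0.5, rng=None):
    """Clip a local gradient to L2 norm `clip`, then add Gaussian noise.

    A common LDP-style perturbation; `sigma` is an assumed noise multiplier,
    not a calibrated privacy parameter.
    """
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(grad)
    clipped = grad * min(1.0, clip / (norm + 1e-12))
    return clipped + rng.normal(0.0, sigma * clip, size=grad.shape)

def gossip_round(grads, pairs):
    """One pair-wise gossip round: each matched pair of nodes replaces
    its gradient with the pair's average, driving the network toward
    consensus on the mean gradient."""
    out = {node: g.copy() for node, g in grads.items()}
    for i, j in pairs:
        avg = (out[i] + out[j]) / 2.0
        out[i] = avg.copy()
        out[j] = avg.copy()
    return out

# Two nodes perturb their gradients locally, then gossip once.
rng = np.random.default_rng(0)
grads = {0: ldp_perturb(np.array([1.0, 0.0]), rng=rng),
         1: ldp_perturb(np.array([0.0, 1.0]), rng=rng)}
averaged = gossip_round(grads, pairs=[(0, 1)])
```

After the gossip round both nodes hold the same averaged (noisy) gradient; repeating this over random pairings spreads information through the network without any central aggregator, which is the communication pattern the abstract refers to.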
