Abstract

Federated Learning (FL), which has emerged at the convergence of machine learning and information and communication technology, is vital to the further development of machine learning. However, standard FL settings cannot meet the requirements of decentralized environments, especially peer-to-peer (P2P) networks, where a fully connected central server is unavailable due to limited communication ranges. In this paper, to satisfy the requirements of security, privacy preservation, and robustness for FL in P2P networks, we propose a decentralized global model training protocol named PPT. Specifically, PPT aggregates local model update parameters in a single-hop manner and uses a symmetric cryptosystem to secure communications between network nodes, with an enhanced Eschenauer-Gligor (E-G) scheme proposed for secure key distribution. Further, PPT generates random noise for privacy preservation without reducing model accuracy, since the noise is ultimately eliminated. PPT also adopts game theory to resist collusion attacks, and includes careful designs for communication efficiency and dropout robustness. Through extensive analysis, we demonstrate that PPT resists various security threats and preserves user privacy. Experiments on Trec05, Trec06p, Trec07, and the SMS Spam Collection v.1 confirm the 20× and 12× improvements in computation efficiency that PPT achieves over Google's Secure Aggregation and Local Differential Privacy (LDP)-based FL methods, respectively. More importantly, the global model trained by PPT outperforms that trained by LDP-based FL methods in prediction performance (about 14% and 1% improvements in convergence rate and accuracy, respectively).
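The abstract's claim that random noise preserves privacy "without reducing model accuracy since the noise is eliminated ultimately" follows the general idea of cancelling pairwise masks used in secure aggregation. A minimal sketch of that idea is below; it is an illustration of the cancelling-mask technique, not the actual PPT protocol, and the function names and the pairwise-seed scheme are assumptions standing in for the shared symmetric keys that an E-G-style key distribution would provide.

```python
import random

# Hypothetical sketch (not the actual PPT protocol): each pair of nodes
# (i, j) derives a shared random mask from a common seed, playing the
# role of the shared symmetric key an E-G scheme would establish.
# The lower-indexed node adds the mask and the higher-indexed node
# subtracts it, so all masks cancel in the aggregate: the aggregator
# sees only masked updates, yet recovers the exact sum.

def pairwise_masks(n_nodes, dim, seed_for_pair):
    """Return one mask vector per node; the masks sum to zero overall."""
    masks = [[0.0] * dim for _ in range(n_nodes)]
    for i in range(n_nodes):
        for j in range(i + 1, n_nodes):
            rng = random.Random(seed_for_pair(i, j))  # shared pairwise seed
            m = [rng.uniform(-1.0, 1.0) for _ in range(dim)]
            for k in range(dim):
                masks[i][k] += m[k]   # node i adds the pairwise mask
                masks[j][k] -= m[k]   # node j subtracts the same mask
    return masks

def masked_sum(updates, masks):
    """Aggregate masked updates; pairwise masks cancel in the sum."""
    dim = len(updates[0])
    total = [0.0] * dim
    for u, m in zip(updates, masks):
        for k in range(dim):
            total[k] += u[k] + m[k]
    return total
```

Because every mask appears once with a plus sign and once with a minus sign, the aggregate equals the sum of the true updates (up to floating-point error), which is why the added noise costs no model accuracy.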
