Abstract

Federated learning (FL) enables decentralized model training without sharing raw data. However, various attacks still threaten the FL training process. To address these concerns, differential privacy (DP) and secure multi-party computation (SMC) have been applied, but these methods can degrade accuracy and impose a heavy training load. Moreover, the high communication cost of FL on resource-constrained devices is another challenging problem. In this paper, we propose a novel SMC algorithm for FL (FL-IPFE) to protect the local gradients; it requires no trusted third party (TTP) and is better suited to FL. Furthermore, we propose a secure and efficient FL algorithm (SEFL), which applies compressed sensing (CS) and an all-or-nothing transform (AONT) to minimize the number of model updates that must be transmitted and encrypted. Additionally, FL-IPFE is used to encrypt only the last element of the preprocessed parameters, which guarantees the security of the entire local model update. Participant dropouts are also taken into account. Theoretical analyses demonstrate that the proposed algorithms aggregate model updates with high security. Finally, experimental evaluation reveals that SEFL is more efficient than other state-of-the-art works while providing comparable model accuracy and strong privacy guarantees.
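The abstract's pipeline — compress the local update with CS, apply an AONT so that encrypting only the final element protects the whole update — can be illustrated with a minimal sketch. All names here (`cs_compress`, `aont_package`) are illustrative, not the paper's actual constructions; the sketch assumes a random Gaussian measurement matrix for CS and a Rivest-style package transform for the AONT:

```python
import hashlib
import os
import numpy as np

BLOCK = 32  # SHA-256 digest size; fixed 32-byte blocks for simplicity

def _stream(key: bytes, i: int) -> bytes:
    # One-block keystream derived from the key and block index.
    return hashlib.sha256(key + i.to_bytes(4, "big")).digest()

def _xor(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def cs_compress(x: np.ndarray, m: int, seed: int = 0) -> np.ndarray:
    # CS measurement y = Phi @ x with a random Gaussian Phi (m << len(x));
    # the aggregator can reconstruct x if it is sufficiently sparse.
    rng = np.random.default_rng(seed)
    phi = rng.standard_normal((m, x.size)) / np.sqrt(m)
    return phi @ x

def aont_package(blocks: list[bytes]) -> list[bytes]:
    # Rivest-style package transform: mask every block with a fresh random
    # key K, then append K XORed with hashes of the masked blocks. Without
    # the final block, K (and hence every block) is unrecoverable, so
    # encrypting only that last element protects the whole update.
    key = os.urandom(BLOCK)
    masked = [_xor(b, _stream(key, i)) for i, b in enumerate(blocks)]
    tail = key
    for i, c in enumerate(masked):
        tail = _xor(tail, hashlib.sha256(c + i.to_bytes(4, "big")).digest())
    return masked + [tail]

def aont_unpackage(package: list[bytes]) -> list[bytes]:
    # Recover K from the final block, then unmask every data block.
    masked, tail = package[:-1], package[-1]
    key = tail
    for i, c in enumerate(masked):
        key = _xor(key, hashlib.sha256(c + i.to_bytes(4, "big")).digest())
    return [_xor(c, _stream(key, i)) for i, c in enumerate(masked)]
```

In this sketch a client would quantize the CS measurements into 32-byte blocks, run `aont_package`, send the masked blocks in the clear, and encrypt only the final block (here standing in for the role FL-IPFE plays in SEFL).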
