Abstract

As an emerging collaborative learning paradigm, federated learning is a promising way to combine the model parameters of different users for training and inference without collecting users' original data. However, previous work has not established a practical and efficient solution, owing to the absence of efficient matrix computation and cryptographic schemes for privacy-preserving federated learning, especially under partially homomorphic cryptosystems. In this paper, we propose a Practical and Efficient Privacy-preserving Federated Learning (PEPFL) framework. First, we present a lifted distributed ElGamal cryptosystem for federated learning, which solves the multi-key problem. Second, we develop a Partially Single Instruction Multiple Data (PSIMD) parallelism scheme that encodes a plaintext matrix into a single plaintext for encryption, improving encryption efficiency and reducing communication cost in partially homomorphic cryptosystems. In addition, based on a Convolutional Neural Network (CNN) and the designed cryptosystem, we design a novel privacy-preserving federated learning framework using Momentum Gradient Descent (MGD). Finally, we evaluate the security and performance of PEPFL. The experimental results demonstrate that the scheme is practical, effective, and secure, with low communication and computation costs.
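
To make the abstract's two key mechanisms concrete, the minimal Python sketch below illustrates (i) the additive homomorphism of a lifted ElGamal cryptosystem, which is what allows an aggregator to combine users' encrypted parameters without seeing them, and (ii) a fixed-slot packing in the spirit of PSIMD, which lets one encryption carry several values. The parameters, function names (`encrypt`, `add_ct`, `pack`), and the single-key, one-dimensional simplifications are our assumptions for exposition, not the paper's construction; the actual PEPFL scheme uses a distributed multi-key variant and matrix encodings.

```python
# A toy, self-contained sketch (NOT the paper's exact construction):
# single-key lifted ElGamal plus fixed-slot packing. Parameters are
# deliberately tiny and insecure; real deployments need ~2048-bit groups.
import random

p = 1_000_003                      # toy prime modulus (assumption)
g = 2                              # assumed generator of a large subgroup
x = random.randrange(2, p - 1)     # secret key
h = pow(g, x, p)                   # public key h = g^x mod p

def encrypt(m):
    """Lifted ElGamal: Enc(m) = (g^r, g^m * h^r); m sits in the exponent."""
    r = random.randrange(2, p - 1)
    return (pow(g, r, p), pow(g, m, p) * pow(h, r, p) % p)

def add_ct(c1, c2):
    """Component-wise product of ciphertexts = addition of plaintexts."""
    return (c1[0] * c2[0] % p, c1[1] * c2[1] % p)

def decrypt(c, max_m=10_000):
    """Recover g^m, then solve the small discrete log by brute force."""
    gm = c[1] * pow(c[0], p - 1 - x, p) % p    # g^m = c2 / c1^x
    acc = 1
    for m in range(max_m + 1):
        if acc == gm:
            return m
        acc = acc * g % p
    raise ValueError("plaintext outside brute-force range")

# Additive homomorphism: two users' (integer-scaled) gradient entries
# are summed under encryption.
assert decrypt(add_ct(encrypt(123), encrypt(456))) == 579

# PSIMD-flavoured packing (hypothetical simplification): several small
# non-negative integers share one plaintext via fixed-width slots, so
# slot-wise sums survive plaintext addition as long as no slot overflows.
SLOT = 16
def pack(vec):
    m = 0
    for v in vec:
        m = (m << SLOT) | v
    return m

def unpack(m, n):
    return [(m >> (SLOT * (n - 1 - i))) & ((1 << SLOT) - 1) for i in range(n)]

assert unpack(pack([1, 2, 3]) + pack([4, 5, 6]), 3) == [5, 7, 9]
```

One design point worth noting: because lifted ElGamal decryption recovers the plaintext from its exponent, only small integers are practical, so model updates must be quantized; this is also why packing many small slots into one plaintext is attractive in a partially homomorphic setting.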
