Abstract

Federated learning protects data privacy by uploading only users' local model updates. However, recent research has shown that private data can still be inferred from these updates. Privacy-preserving federated learning also faces challenges concerning the confidentiality of local updates, the robustness of schemes under dynamic membership, and the determination of an appropriate clipping threshold in differentially private federated learning. To address these issues, this paper presents the EPFL-DAC scheme: Enhancing Privacy in Federated Learning with Dynamic Aggregation and Clipping. First, Paillier homomorphic encryption is employed to securely aggregate local updates, ensuring that the server can access only the aggregated result. Second, a dynamic robustness algorithm is designed to keep the aggregation scheme robust when user states change. Finally, a dynamic threshold determination method is proposed to calibrate noise more accurately while resisting collusion attacks in differentially private settings. Experimental results demonstrate that the dynamic aggregation scheme is effective, incurs low computational overhead, and preserves model accuracy. The proposed dynamic threshold determination method also performs well, especially at higher privacy protection levels and larger initial clipping thresholds.
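The secure-aggregation idea in the abstract relies on the additive homomorphism of Paillier encryption: the product of ciphertexts decrypts to the sum of plaintexts, so a server multiplying encrypted client updates learns only their aggregate. The following is a minimal, self-contained sketch of that property using deliberately tiny primes (not the paper's protocol, and far from production parameters, which require moduli of 2048 bits or more):

```python
import math
import random

def lcm(a, b):
    return a * b // math.gcd(a, b)

# Toy Paillier keypair with small primes -- for illustration only.
p, q = 61, 53
n = p * q                 # public modulus
n2 = n * n
g = n + 1                 # standard choice of generator
lam = lcm(p - 1, q - 1)   # private key component lambda
# mu = (L(g^lam mod n^2))^-1 mod n, where L(x) = (x - 1) // n
mu = pow((pow(g, lam, n2) - 1) // n, -1, n)

def encrypt(m):
    """E(m) = g^m * r^n mod n^2 with random r coprime to n."""
    while True:
        r = random.randrange(1, n)
        if math.gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    """D(c) = L(c^lam mod n^2) * mu mod n."""
    return ((pow(c, lam, n2) - 1) // n) * mu % n

# Additive homomorphism: multiplying ciphertexts sums the plaintexts,
# so the aggregator never sees any individual client's update.
updates = [7, 15, 4]      # clients' (quantized) local updates
agg_cipher = 1
for u in updates:
    agg_cipher = (agg_cipher * encrypt(u)) % n2
assert decrypt(agg_cipher) == sum(updates)
```

In a federated setting, each client would encrypt its quantized update under a shared public key, the server would multiply the ciphertexts, and only the decrypted aggregate would be revealed.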
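The abstract's third contribution concerns choosing the clipping threshold that scales the noise in differentially private training. The paper's exact rule is not given in the abstract, so the sketch below shows only the generic mechanics: L2-norm clipping with Gaussian noise scaled to the threshold, plus a hypothetical adaptive rule that nudges the threshold toward the median client norm (function names and the update rule are illustrative assumptions, not the EPFL-DAC method):

```python
import math
import random

def clip_and_noise(update, clip_c, sigma):
    """Clip an update vector to L2 norm clip_c, then add Gaussian
    noise whose scale is proportional to the clipping threshold."""
    norm = math.sqrt(sum(x * x for x in update))
    scale = min(1.0, clip_c / norm) if norm > 0 else 1.0
    clipped = [x * scale for x in update]
    return [x + random.gauss(0.0, sigma * clip_c) for x in clipped]

def update_threshold(clip_c, client_norms, lr=0.2):
    """Hypothetical dynamic rule: move the threshold a fraction lr of
    the way toward the median client update norm each round."""
    client_norms = sorted(client_norms)
    median = client_norms[len(client_norms) // 2]
    return clip_c + lr * (median - clip_c)
```

A threshold that tracks the true update norms avoids the two failure modes the abstract alludes to: a threshold set too high injects more noise than necessary, while one set too low discards most of the signal in each update.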
