Abstract

The requirement for data sharing and privacy has brought increasing attention to federated learning. However, existing aggregation models are too specialized and rarely address the issue of user withdrawal. Moreover, protocols for multiparty entity matching are rarely covered. Thus, there is no systematic framework for performing federated learning tasks. In this paper, we propose a privacy‐preserving federated learning framework (PFLF). We first construct a general secure aggregation model for federated learning scenarios by combining Shamir secret sharing with homomorphic cryptography, ensuring that the aggregated value can be decrypted correctly only when the number of participants is greater than t. Furthermore, we propose a multiparty entity matching protocol that employs secure multiparty computation to solve the entity alignment problem, and a logistic regression algorithm that achieves privacy‐preserving model training and supports the withdrawal of users in vertical federated learning (VFL) scenarios. Finally, the security analyses prove that PFLF preserves data privacy in the honest‐but‐curious model, and the experimental evaluations show that PFLF attains accuracy consistent with the original model and demonstrates practical feasibility.
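The abstract's threshold property, that the aggregate can only be recovered when more than t participants contribute, is the standard guarantee of (t, n) Shamir secret sharing. The following is a minimal illustrative sketch of that primitive on its own (the paper additionally combines it with homomorphic encryption); the prime modulus and helper names are our own assumptions, not the paper's parameters.

```python
# Minimal sketch of (t, n) Shamir secret sharing over a prime field.
# Any t shares reconstruct the secret; fewer reveal nothing about it.
import random

PRIME = 2**127 - 1  # a Mersenne prime, large enough for demo secrets

def make_shares(secret, t, n):
    """Split `secret` into n shares using a random degree-(t-1) polynomial
    with constant term `secret`; any t shares reconstruct it."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(t - 1)]
    return [(x, sum(c * pow(x, k, PRIME) for k, c in enumerate(coeffs)) % PRIME)
            for x in range(1, n + 1)]

def reconstruct(shares):
    """Lagrange interpolation at x = 0 recovers the constant term (the secret)."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        total = (total + yi * num * pow(den, -1, PRIME)) % PRIME
    return total

shares = make_shares(12345, t=3, n=5)
assert reconstruct(shares[:3]) == 12345               # any 3 shares suffice
assert reconstruct(random.sample(shares, 3)) == 12345
```

Because reconstruction needs at least t evaluation points of the degree-(t-1) polynomial, an aggregation server holding fewer shares than the threshold learns nothing, which is what allows PFLF to tolerate participant withdrawal up to that bound.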

Highlights

  • In 2016, AlphaGo used 300,000 sets of Go games as training data and beat the world’s top professional Go players

  • To solve the above problems, we propose a novel privacy-preserving federated learning framework (PFLF) with general aggregation and multiparty entity matching

  • We propose a general aggregation model (GAM) that can be used in many applications where aggregation is required and privacy is protected


Summary

Introduction

In 2016, AlphaGo used 300,000 sets of Go games as training data and beat the world’s top professional Go players. To solve the above problems, we propose a novel PFLF with general aggregation and multiparty entity matching. In this framework, we propose a general aggregation model (GAM) that can be used in many applications where aggregation is required and privacy must be protected; in the GAM, there is little interaction with other participants. We further propose a multiparty entity matching protocol (MEMP) to confirm common entities, based on the GAM and the multiplicative homomorphism of RSA; it enables participants with different data characteristics to determine common entities without leaking, or allowing the inference of, other useful information. Finally, we design a privacy-preserving vertical federated logistic regression (VFLR) by using Paillier homomorphic encryption and merging the GAM with LR; it trains a secure VFLR model and supports the withdrawal of participants.
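The multiplicative homomorphism that MEMP relies on is a well-known property of textbook RSA: the product of two ciphertexts is a valid encryption of the product of the plaintexts. The toy sketch below demonstrates just that property; the key sizes and variable names are illustrative assumptions (real deployments use large keys and padding-aware protocols), not the paper's parameters.

```python
# Sketch of the multiplicative homomorphism of textbook RSA:
# Enc(a) * Enc(b) mod n == Enc(a * b mod n).
# Toy keypair, far too small for real use.
p, q = 61, 53
n = p * q                      # public modulus
e = 17                         # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

a, b = 42, 7
assert enc(a) * enc(b) % n == enc(a * b % n)  # homomorphic property
assert dec(enc(a) * enc(b) % n) == a * b % n  # decrypts to the product
```

This property lets parties blind and compare identifiers under encryption, which is how an RSA-based matching step can reveal only the intersection of entities and nothing else.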

Preliminary
System Architecture
Construction of PFLF
Secure Model Building
Security Analysis
Performance Evaluation
Related Work
Conclusion

