Abstract
Federated learning enables multiple participants to cooperatively train a model, where each participant computes gradients on its own data and a coordinator aggregates the participants' gradients to orchestrate training. To preserve data privacy, gradients need to be protected during training. Pairwise masking satisfies this requirement: it allows participants to blind their gradients with masks and the coordinator to perform aggregation over the blinded values. However, this solution leaks the aggregated results to external adversaries (e.g., an adversarial coordinator), making it vulnerable to quantity inference attacks. Additionally, existing pairwise masking-based schemes rely on a central coordinator and suffer from the single-point-of-failure problem. To address these issues, we propose a decentralized privacy-preserving federated learning scheme called GAIN. GAIN blinds gradients with masks and encrypts the blinded gradients using additively homomorphic encryption, which ensures the confidentiality of gradients and discloses nothing about aggregated results to external adversaries, thereby resisting quantity inference attacks. In GAIN, we design a derivation mechanism for mask generation, where masks are derived from shared keys established by a single key agreement. This mechanism reduces the computation and communication costs compared with existing schemes. Furthermore, GAIN introduces smart contracts over blockchains to aggregate gradients in a decentralized manner, which addresses the single-point-of-failure problem. Smart contracts also provide verifiability for model training. We present a security analysis to demonstrate the security of GAIN, and conduct comprehensive experiments to evaluate its performance.
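To make the pairwise-masking idea concrete, the sketch below (in Python) shows how masks derived from shared pairwise seeds cancel when the blinded gradients are summed, so the aggregator never sees individual gradients. The seed values, PRG stand-in, and parameters here are illustrative assumptions only; this is not GAIN's actual derivation mechanism and it omits GAIN's homomorphic-encryption and blockchain layers.

```python
# Illustrative sketch of pairwise masking (assumptions: toy PRG, random seeds
# standing in for keys established by key agreement between each pair).
import random

NUM_PARTIES = 3
GRAD_LEN = 4
MODULUS = 2**31  # arithmetic is done modulo a public value


def derive_mask(seed: int, length: int) -> list[int]:
    """Expand a shared seed into a mask vector (stand-in for a real PRG)."""
    rng = random.Random(seed)
    return [rng.randrange(MODULUS) for _ in range(length)]


# Hypothetical pairwise seeds: s[(i, j)] is known only to participants i and j;
# in practice it would be derived from a key agreement between them.
seeds = {}
for i in range(NUM_PARTIES):
    for j in range(i + 1, NUM_PARTIES):
        seeds[(i, j)] = random.randrange(2**32)

# Toy integer gradients for each participant.
gradients = [[random.randrange(100) for _ in range(GRAD_LEN)]
             for _ in range(NUM_PARTIES)]


def blind(i: int) -> list[int]:
    """Blind participant i's gradient with its pairwise masks."""
    blinded = list(gradients[i])
    for j in range(NUM_PARTIES):
        if j == i:
            continue
        seed = seeds[(min(i, j), max(i, j))]
        mask = derive_mask(seed, GRAD_LEN)
        sign = 1 if i < j else -1  # opposite signs guarantee cancellation
        blinded = [(b + sign * m) % MODULUS for b, m in zip(blinded, mask)]
    return blinded


# The aggregator only sees blinded gradients; summing them cancels the masks.
aggregate = [0] * GRAD_LEN
for i in range(NUM_PARTIES):
    aggregate = [(a + b) % MODULUS for a, b in zip(aggregate, blind(i))]

plain_sum = [sum(g[k] for g in gradients) % MODULUS for k in range(GRAD_LEN)]
assert aggregate == plain_sum  # masks cancel; only the sum is revealed
```

In GAIN, each blinded gradient would additionally be encrypted under an additively homomorphic scheme before aggregation, so that even the summed result is hidden from external adversaries.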