Abstract

Federated learning (FL) has become a prevalent paradigm for collaboratively training a model on multiple clients under the coordination of a central server. Because traditional FL suffers from client drift due to data heterogeneity across clients, many personalized FL (PFL) techniques have been proposed. However, the issue of privacy leakage within PFL remains inadequately addressed. Incorporating differential privacy (DP) directly into PFL to provide rigorous privacy guarantees amplifies the heterogeneity among clients and introduces high variance into the uploaded information, significantly compromising the model's utility. In this paper, we propose a novel privacy-preserving PFL framework called Differentially Private Federated Elastic weight consolidation (DP-FedEwc), which achieves effective model personalization for each client under sample-level DP. We focus on a practical setting where the server is honest-but-curious. We first implement the FedEwc algorithm in a communication-efficient manner and provide privacy guarantees by perturbing models and their parameter importance (PI). We show that FedEwc is robust to the DP-introduced heterogeneity caused by noisy models, especially when the model is a deep neural network. Since excessive noise may render PI invalid, we present an Adaptive Parameter importance Perturbation (APP) method that adaptively adds Gaussian noise to different coordinates of PI, thereby alleviating the negative effect of DP noise. Moreover, to accurately calibrate the privacy cost incurred by querying heterogeneous data across clients when computing PI through APP, we adapt a Bayesian Accountant (BA) method to DP-FedEwc. We conduct experiments on standard benchmark datasets, and the results confirm the superiority of DP-FedEwc over DP-PFL baselines.
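The abstract does not spell out how APP calibrates its coordinate-wise noise; the sketch below is only a rough illustration of the general idea it describes, namely clipping a parameter-importance (PI) vector and adding Gaussian noise whose scale varies per coordinate. The function name adaptive_pi_perturbation and its parameters (clip_bound, base_sigma) are hypothetical and are not taken from the paper.

```python
import numpy as np

def adaptive_pi_perturbation(param_importance, clip_bound, base_sigma, eps=1e-8):
    """Illustrative coordinate-wise Gaussian perturbation of parameter importance (PI).

    Assumed scheme (not the paper's exact APP calibration): coordinates with larger
    clipped importance receive relatively less noise, so that the importance ranking
    is more likely to survive the DP perturbation.
    """
    # Clip each coordinate to bound the sensitivity of the PI query.
    clipped = np.clip(param_importance, 0.0, clip_bound)

    # Assumed adaptive scaling: allocate noise inversely to normalized importance,
    # keeping the noise level on the order of base_sigma * clip_bound.
    weights = clipped / (clipped.sum() + eps)
    per_coord_sigma = base_sigma * clip_bound * (1.0 - weights)

    # Add independent Gaussian noise with a per-coordinate standard deviation.
    noise = np.random.normal(0.0, per_coord_sigma, size=clipped.shape)
    return clipped + noise


# Toy example: Fisher-style importance values for a small set of parameters.
pi = np.array([0.9, 0.05, 0.3, 0.01])
noisy_pi = adaptive_pi_perturbation(pi, clip_bound=1.0, base_sigma=0.5)
```

Under this assumed allocation, heavily used parameters keep comparatively accurate importance estimates while the total noise budget is preserved; the actual criterion used by APP, and the corresponding privacy accounting via the Bayesian Accountant, are defined in the full paper.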
