Abstract

This paper aims to develop a differentially private federated learning (FL) scheme that adds the least artificial noise while minimizing the energy consumption of participating mobile devices. Observing that some communication-efficient FL approaches, and even the nature of wireless communications, contribute to the differential privacy (DP) of training data on mobile devices, we propose to jointly leverage gradient compression techniques (i.e., gradient quantization and sparsification) and the additive white Gaussian noise (AWGN) in wireless channels to develop a piggyback DP approach for FL over mobile devices. Even with this piggyback approach, the information distortion caused by gradient compression and noise perturbation may slow down FL convergence, which in turn consumes more energy on mobile devices for local computing and model-update communications. We therefore theoretically analyze FL convergence and formulate an energy-efficient FL optimization problem under piggyback DP, transmission power, and FL convergence constraints. Furthermore, we propose an efficient iterative algorithm in which closed-form solutions for artificial DP noise and power control are derived. Extensive simulation and experimental results demonstrate the effectiveness of the proposed scheme in terms of energy efficiency and privacy preservation.
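As a rough illustration of the ingredients the abstract combines, the sketch below applies top-k sparsification, unbiased stochastic quantization, and zero-mean Gaussian perturbation to a local gradient before upload. This is not the paper's implementation: the function names, the QSGD-style quantizer, and all parameter values (k, levels, sigma) are illustrative assumptions, and in the actual scheme the noise scale would be derived from the DP budget together with the channel's AWGN.

```python
# Minimal sketch (illustrative assumptions, not the authors' code) of the
# three perturbation steps the abstract combines: top-k sparsification,
# stochastic quantization, and Gaussian noise standing in for artificial
# DP noise plus wireless-channel AWGN.
import numpy as np

def top_k_sparsify(grad: np.ndarray, k: int) -> np.ndarray:
    """Keep the k largest-magnitude entries of the gradient, zero the rest."""
    out = np.zeros_like(grad)
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    out[idx] = grad[idx]
    return out

def stochastic_quantize(grad: np.ndarray, levels: int) -> np.ndarray:
    """Unbiased QSGD-style quantization onto `levels` uniform levels per sign."""
    norm = np.linalg.norm(grad)
    if norm == 0:
        return grad
    scaled = np.abs(grad) / norm * levels        # each entry lies in [0, levels]
    lower = np.floor(scaled)
    prob = scaled - lower                        # round up with this probability
    quantized = lower + (np.random.rand(*grad.shape) < prob)
    return np.sign(grad) * quantized * norm / levels

def gaussian_perturb(grad: np.ndarray, sigma: float) -> np.ndarray:
    """Add zero-mean Gaussian noise; sigma is a placeholder for the value
    set by the DP budget and the channel noise level."""
    return grad + np.random.normal(0.0, sigma, size=grad.shape)

# Toy usage: compress, then perturb a local gradient before uploading.
g = np.random.randn(1000)
g_compressed = stochastic_quantize(top_k_sparsify(g, k=100), levels=16)
g_uploaded = gaussian_perturb(g_compressed, sigma=0.1)
```

The point of the piggyback idea is visible here: the quantizer's rounding error and the channel noise already distort the upload, so less artificial noise needs to be injected to reach a given DP level.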
