Abstract

Federated learning (FL) is an important approach that enables multiple devices to learn cooperatively without exchanging raw data between the devices and a central server. However, because of bandwidth and other constraints, communication efficiency must be considered when the volume of information that can be transmitted is limited. In this paper, we use lattice quantization from quantization theory, together with a variable inter-communication interval, to improve communication efficiency. Meanwhile, to provide a strong privacy guarantee, we incorporate the notion of differential privacy (DP) into the FL framework with a local SGD algorithm. By adding calibrated noise, we propose a universal lattice quantization for differentially private federated averaging algorithm (ULQ-DP-FedAvg). We derive a tight privacy bound using privacy accounting techniques. We also analyze the convergence bound of ULQ-DP-FedAvg under bit-rate constraints, a growing inter-communication interval, and non-independent and identically distributed (Non-IID) data. It turns out that the algorithm converges and that privacy has scarcely any influence on the convergence rate. The effectiveness of our algorithm is demonstrated on synthetic and real datasets.
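To make the ingredients named above concrete, the following is a minimal illustrative sketch (not the authors' reference implementation) of one communication round combining local SGD, clipping with calibrated Gaussian noise for DP, and a simple scaled-integer lattice quantizer standing in for the universal lattice quantizer. All function names, hyper-parameters, and the toy least-squares model are assumptions made for illustration only.

```python
import numpy as np

def lattice_quantize(v, step=0.05):
    """Map each coordinate to the nearest point of the scaled integer lattice
    step * Z^d (a simple stand-in for a universal lattice quantizer)."""
    return step * np.round(v / step)

def local_sgd(w, data, lr=0.1, steps=5):
    """Run a few local SGD steps on a toy least-squares objective (placeholder model)."""
    X, y = data
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def client_update(w_global, data, clip=1.0, noise_mult=0.8, step=0.05, rng=None):
    """One client's contribution: local SGD, clip the model delta to bound
    sensitivity, add calibrated Gaussian noise (DP), then lattice-quantize
    before uploading under the bit-rate constraint."""
    rng = rng or np.random.default_rng()
    delta = local_sgd(w_global.copy(), data) - w_global
    delta = delta / max(1.0, np.linalg.norm(delta) / clip)           # clipping
    delta = delta + rng.normal(0.0, noise_mult * clip, delta.shape)  # Gaussian mechanism
    return lattice_quantize(delta, step)                             # quantized upload

def server_round(w_global, client_datasets, rng):
    """Average the quantized, privatized deltas and update the global model."""
    deltas = [client_update(w_global, d, rng=rng) for d in client_datasets]
    return w_global + np.mean(deltas, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d = 10
    w_true = rng.normal(size=d)
    # Non-IID toy data: each client draws features with a different scale.
    clients = []
    for k in range(4):
        X = rng.normal(scale=1.0 + k, size=(50, d))
        clients.append((X, X @ w_true + 0.1 * rng.normal(size=50)))
    w = np.zeros(d)
    for _ in range(20):  # communication rounds
        w = server_round(w, clients, rng)
    print("distance to w_true:", np.linalg.norm(w - w_true))
```

In this sketch the inter-communication interval corresponds to the number of local SGD steps per round; the paper additionally lets this interval grow across rounds, which is omitted here for brevity.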
