Abstract

Federated learning (FL) is an emerging distributed learning paradigm in which data acquisition and computation are decoupled to preserve users' data privacy. During training, model weights must be updated at both the base station (BS) and the local users. When exchanged between the users and the BS, these weights are subject to imperfections in the uplink (UL) and downlink (DL) transmissions caused by the limited reliability of wireless channels. In this paper, for an FL algorithm operating in a single-cell massive MIMO cellular communication system, we investigate the impact of both DL and UL transmissions and improve communication efficiency by adjusting the number of global communication rounds, the transmit power, and the average codeword length after quantization. We present simulation results on the standard MNIST dataset with both i.i.d. and non-i.i.d. training data distributions. The results show accelerated learning across various numbers of local steps and transmit power levels, and reduced network energy consumption while similar test accuracy is achieved at higher iteration counts.
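
The training loop summarized above alternates a noisy DL broadcast, local updates, a quantized and noisy UL report, and averaging at the BS. The sketch below illustrates that loop in a minimal form; it is not the paper's system model. The per-coordinate AWGN link model, the uniform stochastic quantizer, the least-squares local objective, and all parameter values (K, snr_db, bits, step counts) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def quantize(w, bits=8):
    """Uniform stochastic quantization of a weight vector to `bits` bits.

    Fewer bits shorten the UL codewords at the cost of quantization noise,
    mirroring the codeword-length trade-off discussed in the abstract.
    """
    lo, hi = w.min(), w.max()
    levels = 2 ** bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    q = np.floor((w - lo) / scale + rng.random(w.shape))  # stochastic rounding
    return lo + q * scale

def noisy_link(w, snr_db):
    """Assumed additive white Gaussian noise model for an imperfect UL/DL link."""
    sig_pow = np.mean(w ** 2)
    noise_pow = sig_pow / (10 ** (snr_db / 10))
    return w + rng.normal(0.0, np.sqrt(noise_pow), w.shape)

def local_sgd(w, X, y, steps=5, lr=0.1):
    """A few local gradient steps on a least-squares objective (illustrative)."""
    for _ in range(steps):
        grad = 2.0 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

# Synthetic linear-regression data split across K users (stand-in for MNIST).
K, n, d = 10, 100, 20
w_true = rng.normal(size=d)
data = []
for _ in range(K):
    X = rng.normal(size=(n, d))
    data.append((X, X @ w_true + 0.1 * rng.normal(size=n)))

w_global = np.zeros(d)
for rnd in range(50):                                   # global communication rounds
    updates = []
    for X, y in data:
        w_local = noisy_link(w_global, snr_db=20)       # imperfect DL broadcast
        w_local = local_sgd(w_local, X, y)              # local training steps
        w_local = quantize(w_local, bits=8)             # UL payload compression
        updates.append(noisy_link(w_local, snr_db=20))  # imperfect UL report
    w_global = np.mean(updates, axis=0)                 # FedAvg-style aggregation at the BS

print("final weight error:", np.linalg.norm(w_global - w_true))
```

Raising snr_db (i.e., transmit power) or bits reduces the distortion each round contributes, while increasing the round count amortizes it, which is the trade-off the paper's tuning targets.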
