Abstract
Due to the dynamics of wireless channels and limited wireless resources (i.e., spectrum), deploying federated learning (FL) over wireless networks is challenged by frequent FL parameter transmission errors and incomplete FL model aggregation across devices. To overcome these challenges, we propose a Global MOdel REuse strategy (GoMORE) that reuses global FL models received in previous FL training iterations to replace erroneous local models caused by imperfect wireless transmissions. We analytically prove that the proposed GoMORE is strictly superior to the existing strategy, especially at low signal-to-noise ratios (SNRs). Based on the derived expression of weight divergence, we optimize the number of devices that participate in the model aggregation to maximize the FL performance under limited communication resources. Numerical results verify that the proposed GoMORE closely approaches the performance upper bound achieved by ideal transmission. It also mitigates the negative impact of non-independent and non-identically distributed (non-IID) data while achieving over 5 dB reduction in energy consumption.
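The model-reuse idea described in the abstract can be illustrated with a minimal sketch: during aggregation, any local model whose wireless transmission failed is replaced by the most recently received global model. This is an illustrative reconstruction, not the paper's implementation; the function name, the flat NumPy parameter vectors, and the plain FedAvg-style mean are all assumptions for clarity.

```python
import numpy as np

def gomore_aggregate(local_models, received_ok, prev_global):
    """Illustrative GoMORE-style aggregation (assumed simplification).

    local_models : list of flat NumPy parameter vectors from devices
    received_ok  : list of bools; False marks an erroneous transmission
    prev_global  : global model from the previous training iteration

    Each erroneous local model is replaced by the previously received
    global model before a simple FedAvg-style average is taken.
    """
    used = [m if ok else prev_global
            for m, ok in zip(local_models, received_ok)]
    return np.mean(used, axis=0)
```

Under this sketch, a device whose upload is corrupted contributes the stale-but-valid global model rather than an erroneous update, which is the substitution GoMORE performs in place of simply dropping the device.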