Abstract

Federated learning provides a communication-efficient training process by alternating between local training and averaging the updated local models. However, it requires perfect acquisition of the models, which is hard to achieve over practical wireless channels, and the resulting noise can severely degrade federated learning. To tackle this challenge, we propose a robust federated learning design that mitigates the effect of noise. Accounting for noise in the communication steps, we first formulate the problem as a parallel optimization for each node under a worst-case noise model. Because the worst-case noise condition is unavailable and the objective function is non-convex, we develop a feasible training scheme based on a sampling-based successive convex approximation algorithm. In addition, we analyze the convergence rate of the proposed design from a theoretical point of view. Finally, simulations demonstrate that the proposed design improves prediction accuracy and reduces the loss function value.
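The alternating procedure the abstract describes (local training, then averaging of noisily received models) can be sketched as follows. This is an illustrative toy, not the paper's algorithm: the least-squares objective, step sizes, noise level, and all function names are assumptions, and the channel noise is modeled simply as additive Gaussian perturbation of each uploaded model.

```python
import numpy as np

def local_train(w, data, lr=0.1, steps=5):
    """Toy local update: a few gradient steps on a least-squares objective."""
    X, y = data
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

def noisy_fedavg_round(w_global, node_data, noise_std=0.01, rng=None):
    """One round: each node trains locally; the server averages the
    uploads, which are corrupted by additive channel noise (an assumed
    stand-in for imperfect model acquisition over the wireless link)."""
    rng = rng if rng is not None else np.random.default_rng(0)
    uploads = []
    for data in node_data:
        w_local = local_train(w_global.copy(), data)
        uploads.append(w_local + rng.normal(0.0, noise_std, w_local.shape))
    return np.mean(uploads, axis=0)

# Synthetic data split across 4 nodes, all drawn from the same true model.
rng = np.random.default_rng(42)
w_true = np.array([1.0, -2.0])
node_data = []
for _ in range(4):
    X = rng.normal(size=(50, 2))
    y = X @ w_true + 0.05 * rng.normal(size=50)
    node_data.append((X, y))

w = np.zeros(2)
for _ in range(20):
    w = noisy_fedavg_round(w, node_data, noise_std=0.01, rng=rng)
print(np.round(w, 2))  # close to w_true despite noisy uploads
```

Averaging across nodes partially cancels the zero-mean channel noise, which is why plain averaging still converges here; the paper's worst-case formulation targets the harder setting where the noise cannot be assumed benign.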
