Abstract
Wireless federated learning (WFL) trains machine learning (ML) models on wireless edge devices in a distributed manner, without collecting users' data. In WFL, the quality of a local model update depends on the variance of the local stochastic gradient, which is determined by the mini-batch size used to compute the update. In this paper, we study quality-aware distributed computation for WFL with non-convex problems and asynchronous algorithms, using the mini-batch size as a "knob" to control the quality of users' local updates. We first characterize performance bounds on the training loss as a function of the local updates' quality over the training process, for both non-convex and asynchronous settings. Our findings reveal that the impact of a local update's quality on the training loss 1) increases with the stepsize used for that update in non-convex learning, and 2) increases with the number of other users' local updates coupled with that update (as determined by the update delays) in asynchronous learning. Based on these insights, we design channel-aware adaptive algorithms that determine users' mini-batch sizes over the training process, based on the impact of local updates' quality on the training loss as well as users' wireless channel conditions (which determine the update delays) and computation costs. We evaluate the proposed quality-aware adaptive algorithms using simulations, which demonstrate improved learning accuracy and reduced learning cost.
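The abstract leaves the adaptive rule unspecified; the sketch below is a minimal, assumed model of such a rule, not the paper's algorithm. It encodes the two stated insights: a local update's quality impact is weighted by its stepsize and by the number of other updates coupled with it (which grows with its channel-dependent delay), while a larger mini-batch reduces gradient variance at a linear computation cost. All names (choose_batch_size, grad_var, arrival_rate, lam) and the 1/b variance model are illustrative assumptions.

```python
import math

def choose_batch_size(stepsize, grad_var, sec_per_sample, lam,
                      update_bits, channel_rate, arrival_rate,
                      b_min=1, b_max=1024):
    """Hedged sketch of a quality-aware, channel-aware batch-size rule.

    Assumed model (not from the paper): the variance of a size-b mini-batch
    gradient scales as grad_var / b, and its impact on the training loss is
    weighted by the stepsize and, in the asynchronous setting, by the number
    of other updates that arrive while this one is in flight (its delay
    times the system's update arrival rate). Computation costs
    lam * sec_per_sample per sample. Minimizing
        w * grad_var / b + lam * sec_per_sample * b
    over b gives a closed-form minimizer, clipped to [b_min, b_max].
    """
    comm_delay = update_bits / channel_rate     # set by the wireless channel
    coupled = arrival_rate * comm_delay         # updates overlapping this one
    w = stepsize * (1.0 + coupled)              # quality-impact weight
    b_star = math.sqrt(w * grad_var / (lam * sec_per_sample))
    return max(b_min, min(b_max, round(b_star)))

# A user on a slow channel has a more delayed, more heavily coupled update,
# so this rule asks that user for a larger mini-batch to raise its quality.
print(choose_batch_size(stepsize=0.1, grad_var=4.0, sec_per_sample=1e-3,
                        lam=0.05, update_bits=1e6, channel_rate=2e5,
                        arrival_rate=3.0))
```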