Abstract

A distributed learning problem over a multiple access channel (MAC) in a large wireless network is considered. The objective function is the sum of the nodes' local loss functions. The inference decision is made at the network edge, based on data received from distributed nodes that transmit over a noisy fading MAC. We develop a novel Gradient-Based Multiple Access (GBMA) algorithm to solve the distributed learning problem over the MAC. Specifically, each node transmits an analog function of its local gradient using common shaping waveforms. The network edge receives a superposition of the analog transmitted signals, which represents a noisy, distorted gradient used to update the estimate. We analyze the performance of GBMA theoretically, and prove that it can approach the convergence rate of the centralized gradient descent (GD) algorithm in large networks, for both convex and strongly convex loss functions with Lipschitz gradients.
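To make the update rule concrete, the following is a minimal simulation sketch of the gradient-superposition idea the abstract describes: each node computes its local gradient, the channel sums the per-node analog signals with fading and additive noise, and the edge uses that superposition as a noisy gradient in a GD step. All specifics (quadratic local losses, the Gaussian fading model, the step size, and the noise level) are illustrative assumptions, not details fixed by the abstract.

```python
import numpy as np

# Hypothetical sketch of a GBMA-style update over a noisy fading MAC.
# Assumed local loss at node n: f_n(theta) = 0.5 * ||A_n @ theta - b_n||^2,
# so the global objective is the sum of the nodes' local losses.

rng = np.random.default_rng(0)
N = 50        # number of nodes (assumed)
d = 5         # parameter dimension (assumed)
mu = 0.05     # step size (assumed)
sigma = 0.1   # receiver noise std (assumed)
T = 200       # iterations

# Synthetic local data defining each node's loss
A = rng.normal(size=(N, d, d))
theta_star = rng.normal(size=d)
b = np.einsum('nij,j->ni', A, theta_star)

def local_gradient(n, theta):
    """Gradient of the assumed local loss 0.5*||A_n theta - b_n||^2."""
    return A[n].T @ (A[n] @ theta - b[n])

theta = np.zeros(d)
for t in range(T):
    grads = np.stack([local_gradient(n, theta) for n in range(N)])
    # The MAC superimposes the analog transmissions: per-node fading
    # gains plus additive receiver noise (both assumed models).
    h = np.abs(rng.normal(1.0, 0.1, size=N))   # fading gains
    w = rng.normal(0.0, sigma, size=d)          # additive noise
    received = (h[:, None] * grads).sum(axis=0) + w
    # The edge treats the received superposition as a scaled noisy
    # gradient of the global objective and takes a GD step.
    theta = theta - mu * received / N

err = np.linalg.norm(theta - theta_star)
```

Under these assumptions the estimate approaches `theta_star` despite the fading and noise, mirroring the abstract's claim that the distorted aggregate gradient still supports GD-like convergence in large networks.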
