Abstract

Recent work has shown that analog-transmission-based federated learning (FL) uses communication resources more efficiently than conventional digital transmission. In this paper, we propose an effective model compression strategy that enables analog FL under constrained communication bandwidth. The proposed approach is based on pattern shared sparsification, which applies the same sparsification pattern to the parameter vectors uploaded by all edge devices, rather than having each edge device apply sparsification independently. In particular, we propose specific schemes for determining the sparsification pattern and characterize the convergence of analog FL under these schemes by deriving a closed-form upper bound on the convergence rate and residual error. The closed-form expression captures the effect of the communication bandwidth and power budget on the performance of analog FL. Our convergence analysis proves that the model parameter obtained with the proposed schemes converges to the optimal model parameter. Numerical results show that the proposed pattern shared sparsification consistently improves the performance of analog FL across various system parameter settings, with the gains being most significant under scarce communication bandwidth and a limited transmit power budget.
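To make the core idea concrete, the sketch below contrasts independent sparsification with pattern shared sparsification for analog (over-the-air) aggregation. This is a minimal illustration, not the paper's exact schemes: the rule used to pick the shared pattern (here, the mean update magnitude across devices) is an assumption for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: 4 edge devices, 10-dimensional model updates,
# and bandwidth for transmitting only k = 3 entries per device.
num_devices, dim, k = 4, 10, 3
updates = rng.standard_normal((num_devices, dim))

def topk_mask(v, k):
    """Binary mask keeping the k largest-magnitude entries of v."""
    mask = np.zeros_like(v)
    mask[np.argsort(np.abs(v))[-k:]] = 1.0
    return mask

# Independent sparsification: each device chooses its own top-k pattern,
# so the supports generally differ across devices and analog superposition
# combines mismatched coordinates.
independent = np.array([u * topk_mask(u, k) for u in updates])

# Pattern shared sparsification: every device keeps the SAME k coordinates.
# The pattern rule below (top-k of the mean update magnitude) is a
# placeholder for the pattern-selection schemes proposed in the paper.
shared_mask = topk_mask(np.abs(updates).mean(axis=0), k)
shared = updates * shared_mask  # broadcast the common mask to all devices

# Analog (over-the-air) aggregation superimposes the transmitted vectors,
# so the aggregate inherits the common k-sparse support.
aggregate = shared.sum(axis=0)
print("shared support size:", int(shared_mask.sum()))
```

With a shared pattern, all transmit power and bandwidth are concentrated on the same k coordinates, so the superimposed signal is itself k-sparse; with independent patterns, the aggregate support can be as large as `num_devices * k`.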
