Abstract

To achieve communication-efficient federated multi-task learning (FMTL), we propose an over-the-air FMTL (OA-FMTL) framework, where multiple learning tasks deployed on edge devices share a non-orthogonal fading channel under the coordination of an edge server (ES). To overcome the inter-task interference inherent in non-orthogonal transmission among tasks, we design a novel transmission method called model sparsification and random compression (MSRC), together with a reception method called modified turbo compressed sensing (M-Turbo-CS). More specifically, at each edge device, the local model updates of all tasks are first sparsified and randomly compressed with a different random compression matrix for each task, before being superimposed and sent over the uplink channel. The ES then reconstructs the model aggregations of all the tasks from the channel observations through a modified version of the turbo compressed sensing (Turbo-CS) algorithm, namely M-Turbo-CS. We analyze the performance of the proposed OA-FMTL framework with MSRC and M-Turbo-CS. Based on this analysis, we formulate a communication-learning optimization problem to improve system performance by adjusting the power allocation among the tasks at the edge devices. Numerical simulations show that the proposed OA-FMTL framework efficiently suppresses inter-task interference, achieving a learning performance comparable to the inter-task-interference-free bound at a significantly reduced communication overhead. It is also shown that the proposed inter-task power allocation optimization algorithm further reduces the overall communication overhead by appropriately adjusting the power allocation among the tasks.
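The MSRC transmit step described above can be illustrated with a minimal NumPy sketch. The 10% sparsity level, the Gaussian compression matrices, and the function names are illustrative assumptions, not details from the paper; the sketch only shows the structure of sparsify, compress per task, scale by allocated power, and superimpose:

```python
import numpy as np

def sparsify(x, k):
    """Keep the k largest-magnitude entries of x, zero out the rest."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-k:]
    out[idx] = x[idx]
    return out

def msrc_transmit(updates, A_list, powers):
    """MSRC-style transmit sketch: sparsify each task's model update,
    compress it with a task-specific random matrix, scale by its power
    allocation, and superimpose all tasks into one channel input."""
    m = A_list[0].shape[0]           # compressed dimension
    s = np.zeros(m)
    for g, A, p in zip(updates, A_list, powers):
        g_sp = sparsify(g, k=max(1, g.size // 10))  # 10% sparsity (assumed)
        s += np.sqrt(p) * (A @ g_sp)                # compress, power-scale, add
    return s

# toy usage: two tasks, model dimension 100, compressed dimension 40
rng = np.random.default_rng(0)
d, m = 100, 40
updates = [rng.standard_normal(d) for _ in range(2)]
A_list = [rng.standard_normal((m, d)) / np.sqrt(m) for _ in range(2)]
s = msrc_transmit(updates, A_list, powers=[0.6, 0.4])
print(s.shape)  # (40,)
```

Because each task uses its own random matrix, the ES can treat the other tasks' contributions as structured interference when recovering each sparse update, which is what the M-Turbo-CS receiver exploits.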

