Abstract

Federated edge learning (FEEL) is envisioned as a promising paradigm for privacy-preserving distributed learning. However, it suffers from excessive learning time due to straggler devices, which arise from the heterogeneity of wireless channels and of edge devices' computing resources. In this paper, a novel topology-optimized federated edge learning (TOFEL) scheme is proposed to tackle the heterogeneity issue in federated learning and thereby improve communication-and-computation efficiency. Specifically, a problem of jointly optimizing the gradient aggregation topology and the devices' computing speeds is formulated to minimize a weighted sum of energy consumption and latency. To solve this mixed-integer nonlinear problem, we propose a novel penalty-based successive convex approximation method, which converges to a stationary point of the primal problem under mild conditions. Simulation results demonstrate that the proposed TOFEL scheme remarkably accelerates the federated learning process and achieves higher energy efficiency.
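As a rough sketch of the joint design described above (the symbols below are illustrative placeholders, not necessarily the paper's notation), the formulation can be written as:

```latex
% Illustrative sketch only: A collects the binary aggregation-topology
% variables a_{ij} (device i forwards its gradient to node j), f collects
% the devices' computing speeds, and alpha in [0,1] trades off the two costs.
\min_{\mathbf{A},\,\mathbf{f}} \;\;
  \alpha\, E(\mathbf{A},\mathbf{f}) \;+\; (1-\alpha)\, T(\mathbf{A},\mathbf{f})
\quad \text{s.t.} \quad
  a_{ij} \in \{0,1\}, \qquad
  f_i^{\min} \le f_i \le f_i^{\max},
```

where $E$ and $T$ denote the total energy consumption and the per-round latency, respectively. A penalty-based successive convex approximation method of the kind mentioned in the abstract would typically relax $a_{ij} \in \{0,1\}$ to $a_{ij} \in [0,1]$, add a penalty term such as $\rho \sum_{i,j} a_{ij}(1 - a_{ij})$ that vanishes only at integer points, and then solve a sequence of convex approximations of the resulting problem.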
