Abstract

Federated learning (FL) is a framework for collaboratively training a deep learning model without sharing the raw data of distributed clients. Due to the limited communication resources in wireless networks, only a subset of clients is allowed to communicate their model updates to the central server in each communication round. Existing works select clients based on the wireless channel states or the size of the local dataset, while ignoring the local model status, i.e., the content of the local updates. In this paper, we first provide a comprehensive convergence analysis to investigate the impact of the client selection strategy on FL in wireless networks. Based on the theoretical analysis, we propose a novel client selection approach, referred to as content-aware client selection (CACS). The proposed CACS strategy selects the client subset based on both the wireless channel states and the content of the model updates from the clients. Empirical experiments verify the analysis and demonstrate that the CACS strategy outperforms state-of-the-art methods in terms of both the convergence speed and the learning performance of FL in wireless networks.
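
As a rough illustration of the kind of selection rule the abstract describes, the minimal sketch below scores each client by combining a channel-quality term with the norm of its local update and picks the top-scoring subset. The weighting scheme, the `alpha` parameter, and all function and variable names are illustrative assumptions; the actual CACS criterion is the one derived in the paper, not reproduced here.

```python
import numpy as np

def select_clients(channel_gains, local_updates, num_selected, alpha=0.5):
    """Hypothetical content-aware selection: return indices of the clients
    with the highest combined channel/content score.

    channel_gains: array of shape (K,), per-client channel quality (assumed metric).
    local_updates: list of K flattened local model update vectors.
    num_selected:  number of clients allowed to transmit this round.
    alpha:         illustrative weight between channel and content terms.
    """
    # Content term: magnitude of each client's local model update.
    update_norms = np.array([np.linalg.norm(u) for u in local_updates])

    # Normalize both terms so they are on comparable scales.
    chan = channel_gains / (channel_gains.max() + 1e-12)
    cont = update_norms / (update_norms.max() + 1e-12)

    # Combined score: higher is better; keep the top num_selected clients.
    scores = alpha * chan + (1.0 - alpha) * cont
    return np.argsort(scores)[::-1][:num_selected]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    K = 10                                               # total clients
    gains = rng.rayleigh(scale=1.0, size=K)              # toy channel states
    updates = [rng.normal(size=100) for _ in range(K)]   # toy local updates
    print(select_clients(gains, updates, num_selected=3))
```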
