Abstract
Federated Learning (FL) is a distributed machine learning approach that preserves privacy by allowing numerous devices to collaboratively train a global model without sharing raw data. However, model updates are frequently exchanged between these devices and the central server, and many of the updates are similar or redundant, wasting communication and computation. Selecting a subset of all devices for FL training can mitigate this issue. Nevertheless, most existing device selection methods are biased, while unbiased methods often perform unstably on Non-Independent Identically Distributed (Non-IID) and unbalanced data. To address this, we propose a stable Diversity-aware Unbiased Device Selection (DUDS) method for FL on Non-IID and unbalanced data. DUDS diversifies the participation probabilities for device sampling in each FL training round, mitigating the randomness of the individual device selection process. By using a leader-based cluster adjustment mechanism to satisfy the unbiased selection constraints, DUDS achieves stable convergence and results close to the optimum obtained when all devices participate. Extensive experiments demonstrate the effectiveness of DUDS in Non-IID and unbalanced data scenarios in FL.
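To make the unbiasedness constraint concrete, the following is a minimal sketch of the generic unbiased client-sampling idea the abstract builds on, not the DUDS algorithm itself (which additionally diversifies participation probabilities and uses a leader-based cluster adjustment). The function names, the toy weights, and the with-replacement sampling scheme are all illustrative assumptions: devices are sampled with probabilities p_i, and each sampled update is reweighted by w_i / p_i so the aggregate equals the full-participation weighted average in expectation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_devices(probs, k):
    """Sample k device indices with replacement, proportional to probs."""
    probs = np.asarray(probs, dtype=float) / np.sum(probs)
    selected = rng.choice(len(probs), size=k, replace=True, p=probs)
    return selected, probs

def unbiased_aggregate(updates, data_weights, probs, selected):
    """Importance-weighted average of the sampled updates.

    E[result] = sum_i data_weights[i] * updates[i], i.e. the update that
    would be obtained if every device participated, for any sampling
    distribution with probs[i] > 0.
    """
    k = len(selected)
    return sum(data_weights[i] * updates[i] / probs[i] for i in selected) / k

# Toy check: 5 devices with scalar "updates" and data-size weights.
updates = [np.array([float(i)]) for i in range(5)]
data_weights = np.array([0.10, 0.30, 0.20, 0.25, 0.15])
selected, probs = sample_devices(data_weights, k=3)
print(unbiased_aggregate(updates, data_weights, probs, selected))
```

Averaged over many rounds, the printed aggregate converges to the full-participation average; the instability the abstract targets comes from the round-to-round variance of this estimator, which grows when Non-IID, unbalanced data makes individual updates very different from one another.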