Abstract
Federated Learning (FL) typically involves a large number of clients that are heterogeneous in both data statistics and resource budgets collaboratively training a model, which compromises model performance. Recent research has focused on customizing FL frameworks to address these issues. However, compared with models trained on independent and identically distributed (IID) data, these methods still suffer performance degradation in non-IID scenarios, and their resource consumption remains considerable. In this work, we present FedRich, an efficient FL framework that tackles both statistical and resource heterogeneity. The key ideas of FedRich are adaptive segmentation of the model and heuristic scheduling of the active clients. Adaptive segmentation customizes the model according to each client's resources, enabling clients with varying resource budgets to conduct local training. The heuristic scheduling strategy selects appropriate clients to participate in federated training, mitigating statistical heterogeneity. Moreover, FedRich incorporates a hierarchical aggregation mechanism to stably aggregate heterogeneous models of different sophistication. Extensive experimental results on three benchmark datasets demonstrate that FedRich outperforms state-of-the-art heterogeneous FL approaches.
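The abstract describes resource-dependent model segmentation and aggregation of heterogeneous submodels only at a high level. The sketch below illustrates one common way such a scheme can be realized: width-based submodel extraction and coverage-weighted averaging of the returned updates. The function names (`segment_model`, `aggregate`) and the segmentation rule are illustrative assumptions for this sketch, not the paper's actual algorithm.

```python
import numpy as np

# Hypothetical sketch of resource-dependent model segmentation and
# aggregation of heterogeneous submodels. The width-based rule used here
# is an assumption for illustration, not FedRich's actual design.

def segment_model(global_weights, ratio):
    """Extract a width-scaled submodel for a resource-limited client:
    keep the first `ratio` fraction of hidden units in a two-layer MLP
    (rows of W1/b1, columns of W2)."""
    k = max(1, int(global_weights["W1"].shape[0] * ratio))
    return {
        "W1": global_weights["W1"][:k, :].copy(),
        "b1": global_weights["b1"][:k].copy(),
        "W2": global_weights["W2"][:, :k].copy(),
        "b2": global_weights["b2"].copy(),
    }

def aggregate(global_weights, client_updates):
    """Average each parameter entry over the clients whose submodel
    covers it; entries no client trained keep their previous value."""
    new_w = {name: np.zeros_like(w) for name, w in global_weights.items()}
    counts = {name: np.zeros_like(w) for name, w in global_weights.items()}
    for update in client_updates:
        k = update["W1"].shape[0]
        new_w["W1"][:k, :] += update["W1"]; counts["W1"][:k, :] += 1
        new_w["b1"][:k]    += update["b1"]; counts["b1"][:k]    += 1
        new_w["W2"][:, :k] += update["W2"]; counts["W2"][:, :k] += 1
        new_w["b2"]        += update["b2"]; counts["b2"]        += 1
    for name in new_w:
        covered = counts[name] > 0
        new_w[name][covered] /= counts[name][covered]
        new_w[name][~covered] = global_weights[name][~covered]
    return new_w

# Usage: three clients with different resource budgets receive submodels
# of one global two-layer model (their "updates" here are stand-ins).
rng = np.random.default_rng(0)
global_weights = {"W1": rng.normal(size=(8, 4)), "b1": np.zeros(8),
                  "W2": rng.normal(size=(2, 8)), "b2": np.zeros(2)}
updates = [segment_model(global_weights, r) for r in (0.25, 0.5, 1.0)]
global_weights = aggregate(global_weights, updates)
```

In this kind of scheme, parameters shared by many submodels are averaged over more clients, while capacity only the full model covers is updated solely by well-resourced clients; how FedRich's hierarchical aggregation actually orders and weights these updates is specified in the paper, not here.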