Abstract

Federated learning (FL) is a well-established distributed machine-learning paradigm that enables training global models on massively distributed, multi-owner data. However, classic FL algorithms such as Federated Averaging (FedAvg) generally underperform when faced with Non-Independent and Identically Distributed (Non-IID) data. This problem is further aggravated by certain training and architectural choices, such as the optimizer, regularization, and normalization techniques. In this paper, we introduce FedBS, an efficient new strategy for handling global models that contain Batch Normalization (BN) layers in the presence of Non-IID data. FedBS modifies FedAvg by introducing a new aggregation rule at the server side, while retaining full compatibility with BN. Through a comprehensive set of experiments on the CIFAR-10, MNIST, and Fashion-MNIST datasets under various Non-IID data settings, we empirically show that FedBS outperforms both classical FedAvg and the state-of-the-art FedProx. Furthermore, we observed that in some cases FedBS can be 2× faster than other FL approaches while also reaching higher test accuracy.
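
The abstract does not spell out FedBS's new aggregation rule, so the sketch below only illustrates the baseline it modifies: the standard FedAvg server-side step, a data-size-weighted average of the clients' parameters. All names (`aggregate_fedavg`, `client_weights`, `client_sizes`) are illustrative, not taken from the paper.

```python
# Minimal sketch of the standard FedAvg server-side aggregation that FedBS
# modifies (per the abstract). This is NOT FedBS itself; it only shows the
# baseline rule: a weighted average of client parameters by local data size.
from typing import Dict, List
import numpy as np


def aggregate_fedavg(client_weights: List[Dict[str, np.ndarray]],
                     client_sizes: List[int]) -> Dict[str, np.ndarray]:
    """Return the data-size-weighted average of per-client parameter dicts."""
    total = float(sum(client_sizes))
    global_weights: Dict[str, np.ndarray] = {}
    for name in client_weights[0]:
        global_weights[name] = sum(
            (size / total) * weights[name]
            for weights, size in zip(client_weights, client_sizes)
        )
    return global_weights


if __name__ == "__main__":
    # Two toy clients, each holding a single parameter tensor "w".
    clients = [{"w": np.ones(3)}, {"w": 3 * np.ones(3)}]
    sizes = [100, 300]  # the second client holds 3x more data, so it dominates
    print(aggregate_fedavg(clients, sizes))  # -> {'w': array([2.5, 2.5, 2.5])}
```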
