Abstract

We propose Collaborative Learning with Synonyms (CLSyn), a robust and versatile collaborative machine learning framework that tolerates unexpected client absence during training while maintaining high model accuracy. Client absence during collaborative training can seriously degrade model performance, particularly when client data are unbalanced and non-IID. We address this issue by introducing data digests of the clients' training samples. Expansions of these digests, called synonyms, can represent the original samples on the server and thus maintain overall model accuracy even after the clients become unavailable. We compare our CLSyn implementations against three centralized federated learning algorithms, FedAvg, FedProx, and FedNova, as baselines. Results on CIFAR-10, CIFAR-100, and EMNIST show that CLSyn consistently outperforms these baselines by significant margins under various client absence scenarios.
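The abstract does not specify how digests are constructed or how synonyms are generated, so the following is only a minimal Python/NumPy sketch of the stated idea: a client compresses each sample into a compact digest, and the server later expands that digest into several synonym samples that stand in for the original once the client is absent. All names here (`make_digest`, `expand_synonyms`, `proj`, `n_synonyms`) and the linear-projection/noise construction are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_digest(x, proj):
    """Client side (assumed): compress a sample into a compact
    digest via a shared random linear projection."""
    return x @ proj

def expand_synonyms(digest, proj, n_synonyms=4, noise=0.05):
    """Server side (assumed): expand a digest into several
    'synonyms' that substitute for the original sample, here by
    pseudo-inverting the projection and adding small perturbations."""
    base = digest @ np.linalg.pinv(proj)
    return [base + noise * rng.standard_normal(base.shape)
            for _ in range(n_synonyms)]

# Toy usage: the client sends the digest once; after the client
# becomes unavailable, the server trains on the synonyms instead.
d_in, d_digest = 32, 8
proj = rng.standard_normal((d_in, d_digest))
x = rng.standard_normal(d_in)

digest = make_digest(x, proj)             # uploaded before absence
synonyms = expand_synonyms(digest, proj)  # used in place of x
print(len(synonyms), synonyms[0].shape)   # -> 4 (32,)
```

Under this reading, the digest trades a small loss of sample fidelity for robustness: the server retains enough information to keep the absent client's data distribution represented in subsequent training rounds.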
