Abstract
Federated learning allows multiple clients to jointly train a model on their private data without revealing that data to a centralized server. Consequently, federated learning has attracted increasing attention in recent years, and many algorithms have been proposed. However, existing federated learning algorithms typically assume static data and tend to fail in data stream scenarios. Because distributions/concepts vary within and among the clients over time, the jointly learned model must adapt to these emerging concepts dynamically and simultaneously. The task becomes even more challenging when the continuously arriving data are only partially labeled at the participating clients. In this paper, we propose SFLEDS (Semi-supervised Federated Learning on Evolving Data Streams), a new prototype-based federated learning method that tackles label scarcity, concept drift, and privacy preservation in the federated semi-supervised evolving data stream setting. Extensive experiments show that SFLEDS outperforms both state-of-the-art semi-supervised and supervised algorithms. The source code for the proposed method is publicly available on GitHub (https://github.com/mvisionai/FedLimited).