Abstract

Federated learning (FL) enables multiple clients to jointly train high-performance deep learning models while keeping the training data local. However, this form of efficient collaborative learning is difficult to achieve when the clients' local data are not independent and identically distributed (i.e., non-IID). Despite extensive efforts to address this challenge, results on image classification tasks remain inadequate. In this paper, we propose FedProc: prototypical contrastive federated learning. The core idea of this approach is to use class prototypes as global knowledge to correct the drift of each client's local training. Specifically, we design a local network architecture and a global prototypical contrastive loss to regularize the training of the local model. These components keep the direction of local optimization consistent with the global optimum, so that the global model achieves good performance on non-IID data. Experimental evaluation, supported by theoretical analysis, demonstrates that FedProc improves accuracy by 1.6% to 7.9% over state-of-the-art federated learning methods at an acceptable computational cost.
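
To make the idea concrete, the snippet below is a minimal sketch, not the authors' released implementation, of how a global prototype contrastive loss could be computed on a client, assuming the server broadcasts one aggregated prototype per class before each round. The function name `global_prototype_contrastive_loss` and the temperature value are illustrative assumptions.

```python
# Minimal sketch of a prototype-based contrastive loss (illustrative, not the official FedProc code).
import torch
import torch.nn.functional as F

def global_prototype_contrastive_loss(features, labels, prototypes, temperature=0.5):
    """
    features:   (B, D) embeddings from the client's local encoder
    labels:     (B,)   ground-truth class indices
    prototypes: (C, D) global class prototypes aggregated by the server
    """
    features = F.normalize(features, dim=1)
    prototypes = F.normalize(prototypes, dim=1)
    # InfoNCE-style logits: similarity of each sample to every global prototype.
    logits = features @ prototypes.t() / temperature   # (B, C)
    # Pull each sample toward its own class prototype and push it away from the others.
    return F.cross_entropy(logits, labels)
```

In this reading, the loss acts as the regularizer described above: it pulls each local embedding toward its class's global prototype, keeping local optimization aligned with the global objective.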
