Abstract

This paper addresses personalized federated learning (FL) through the lens of contrastive learning (CL), in which many clients collaboratively train pattern classifiers. Traditional FL frameworks mostly encourage the server's global model and the clients' local models to be similar, often ignoring data heterogeneity across clients. Aiming at better per-client performance, this study introduces a personalized federated contrastive learning model, dubbed PerFCL, built on a new approach to doubly contrastive representation learning (DCL). Concretely, PerFCL splits each local model into a shared part and a personalized part and employs two CL losses: one compares the shared parts of the local models with the global model, and the other compares the personalized parts with the global model. To encourage the two parts to differ, we formulate a dual optimization problem that maximizes the comparison agreement for the former while minimizing it for the latter. We evaluated the proposed model on three publicly available datasets for federated image classification. Experimental results show that PerFCL benefits from the proposed DCL strategy and outperforms state-of-the-art federated learning models.
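The dual objective described above can be sketched as follows. This is a minimal illustration, not the paper's actual implementation: it assumes cosine similarity as the agreement measure and a hypothetical weighting factor `lam`, and it omits the temperature-scaled negatives a full contrastive loss would use.

```python
import numpy as np

def cosine(a, b):
    """Cosine similarity between two representation vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def dcl_loss(z_shared, z_personal, z_global, lam=1.0):
    """Sketch of a doubly contrastive objective (assumed form, not PerFCL's exact loss).

    - Pulls the shared-part representation toward the global model's
      representation (maximize agreement: minimize 1 - similarity).
    - Pushes the personalized-part representation away from the global
      representation (minimize agreement: penalize similarity).
    `lam` is a hypothetical trade-off weight between the two terms.
    """
    agree = cosine(z_shared, z_global)      # agreement term to maximize
    disagree = cosine(z_personal, z_global) # agreement term to minimize
    return (1.0 - agree) + lam * disagree
```

Under this sketch, the loss is lowest when the shared representation aligns with the global one while the personalized representation stays orthogonal to it, which matches the stated goal of keeping the two parts distinct.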
