Abstract

The traditional approach in FL tries to learn a single global model collaboratively with the help of many clients under the orchestration of a central server. However, a single global model might not serve all participating clients well under data heterogeneity. Personalization of the global model therefore becomes crucial for handling the challenges that arise from statistical heterogeneity and the non-IID distribution of data. Unlike prior work, we propose a new approach for obtaining a personalized model from a client-level objective. This further motivates all clients to participate in the federation even under statistical heterogeneity in order to improve their own performance, instead of merely serving as a source of data and model training for the central server. To realize this personalization, we find a small subnetwork for each client by applying either hybrid pruning (a combination of structured and unstructured pruning) or unstructured pruning alone. Through a range of experiments on different benchmarks, we observed that clients with similar data (labels) share similar personalized parameters. Rather than averaging over all parameters of all clients across the entire federation as in traditional FL, we find a subnetwork for each client and efficiently average only the remaining parameters of each client's subnetwork. We call this novel parameter averaging Sub-FedAvg. Furthermore, in our proposed approach, clients are not required to have any knowledge of the underlying data distributions or label similarities of the other clients. The non-IID nature of each client's local data yields distinct subnetworks without any data being shared. We evaluate our method on federated image classification with real-world datasets, where it outperforms the existing state-of-the-art.
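The masked averaging described above can be sketched as follows. This is a minimal illustration with assumed NumPy arrays, not the authors' implementation: each parameter position is averaged only over the clients whose pruning mask retains it, instead of over all clients as in standard FedAvg.

```python
import numpy as np

def sub_fedavg(params, masks):
    """Sketch of Sub-FedAvg-style averaging: each parameter is averaged
    only over the clients whose subnetwork (binary mask) keeps it.
    Positions pruned by every client remain zero."""
    # zero out each client's pruned parameters before summing
    masked_sum = np.sum([p * m for p, m in zip(params, masks)], axis=0)
    # number of clients that retain each position
    counts = np.sum(masks, axis=0)
    # divide by the per-position count (avoid division by zero)
    return masked_sum / np.maximum(counts, 1)

# toy example: two clients, one 4-parameter layer
p1, m1 = np.array([1.0, 2.0, 3.0, 4.0]), np.array([1, 1, 0, 1])
p2, m2 = np.array([5.0, 6.0, 7.0, 8.0]), np.array([1, 0, 1, 1])
print(sub_fedavg([p1, p2], [m1, m2]))  # [3. 2. 7. 6.]
```

Positions kept by both clients (indices 0 and 3) are averaged over both, while positions kept by only one client (indices 1 and 2) take that client's value directly.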
