Abstract

Open set domain adaptation (OSDA) methods have been proposed to address the discrepancy between the source and target domains while recognizing both known and unknown classes in the target domain. Such methods typically require simultaneous access to all source and target data to train the target model. In real-world scenarios, however, data are distributed across multiple clients and cannot be exchanged among them because of privacy protection. Federated learning (FL) is a decentralized approach for training an effective global model while the training data remain distributed among the clients. Despite its potential to address the privacy concerns of data sharing, FL methods for OSDA that can handle unknown classes are not yet available. To tackle this problem, we develop a novel federated OSDA (FOSDA) algorithm. More specifically, FOSDA adopts an uncertainty-aware mechanism to generate a global model from all client models: it reduces the uncertainty of the federated aggregation by down-weighting the contributions of source clients with high uncertainty while retaining those with high consistency. Moreover, a federated class-based weighting strategy is implemented in FOSDA to preserve the category information of the source clients. We conduct comprehensive experiments on three benchmark datasets to evaluate the performance of the proposed method, and the results demonstrate the effectiveness of FOSDA.
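
The uncertainty-aware aggregation described above can be illustrated with a minimal sketch. The snippet below is not the authors' implementation; it assumes each source client reports a scalar uncertainty score alongside its model parameters, and it simply down-weights high-uncertainty clients during weighted parameter averaging. All names (`aggregate_uncertainty_aware`, `client_states`, `uncertainties`) are hypothetical, and the class-based weighting strategy mentioned in the abstract is omitted.

```python
# Minimal sketch of uncertainty-aware federated aggregation (illustrative only,
# not the FOSDA paper's exact method): clients with higher uncertainty receive
# smaller aggregation weights when forming the global model.
from typing import Dict, List

import torch


def aggregate_uncertainty_aware(
    client_states: List[Dict[str, torch.Tensor]],
    uncertainties: List[float],
    eps: float = 1e-8,
) -> Dict[str, torch.Tensor]:
    """Weighted average of client model parameters, with weights that
    shrink as a client's reported uncertainty grows."""
    # Turn uncertainties into normalized aggregation weights (inverse weighting
    # is an assumption; the abstract does not specify the exact scheme).
    inv = torch.tensor([1.0 / (u + eps) for u in uncertainties])
    weights = inv / inv.sum()

    # Weighted average of every parameter tensor across clients.
    global_state: Dict[str, torch.Tensor] = {}
    for name in client_states[0]:
        stacked = torch.stack([state[name].float() for state in client_states])
        w = weights.view(-1, *([1] * (stacked.dim() - 1)))  # broadcast over params
        global_state[name] = (w * stacked).sum(dim=0)
    return global_state
```

In practice, `uncertainties` could be, for example, the average predictive entropy each client measures on its local data; the abstract does not specify the exact uncertainty measure used by FOSDA.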
