Federated learning, as a privacy-preserving learning paradigm, restricts access to each local client's data in order to protect the privacy of the participating parties. However, under heterogeneous data settings, the differing data distributions across clients typically lead to a divergence of learning targets, which is an essential challenge for federated learning. In this article, we propose a federated learning framework with a unified coding space, called FedUCS, which learns uniform cross-client coding rules to address the divergence of learning targets among multiple clients caused by heterogeneous data. A cross-client coordinator, co-trained by multiple clients, serves as a criterion for the coding space and supervises all clients to encode into a uniform space; this is the main contribution of this article. Furthermore, a partial memory mechanism is applied to appropriately retain historical information and avoid forgetting previously learned knowledge. Moreover, to further enhance the distinguishability of the unified coding space, supervised contrastive learning is used to prevent the coding spaces of different categories from overlapping. A series of experiments verifies the effectiveness of the proposed method in federated learning settings with heterogeneous data.
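For reference, the supervised contrastive objective mentioned above, in its standard form (Khosla et al., 2020), pulls together normalized embeddings that share a class label and pushes apart those that do not; FedUCS may employ a variant, so the expression below is only an illustrative sketch of the general technique, not the paper's exact formulation. For an anchor embedding $z_i$, with positive set $P(i)$ (other samples sharing the label of $i$), candidate set $A(i)$ (all other samples in the batch), and temperature $\tau$:

$$
\mathcal{L}_{\mathrm{sup}} = \sum_{i \in I} \frac{-1}{|P(i)|} \sum_{p \in P(i)} \log \frac{\exp(z_i \cdot z_p / \tau)}{\sum_{a \in A(i)} \exp(z_i \cdot z_a / \tau)}
$$

Minimizing this loss over embeddings produced under the shared coding rules encourages per-category clusters to stay separated, which matches the abstract's goal of avoiding intersection between the coding spaces of different categories.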