Federated Learning (FL) has emerged as a promising paradigm for collaborative model training across distributed clients while preserving data privacy. However, prevailing FL approaches aggregate the clients' local models into a global model through multi-round iterative parameter averaging. When data distributions are heterogeneous across clients, this averaging undesirably biases the aggregated model towards certain clients. Moreover, such approaches are restricted to supervised classification tasks and do not support unsupervised clustering. To address these limitations, we propose a novel one-shot FL approach called Federated Adaptive Resonance Theory (FedART), which leverages self-organizing Adaptive Resonance Theory (ART) models to learn category codes, where each code represents a cluster of similar data samples. In FedART, the clients learn to associate their private data with various local category codes. Under heterogeneity, the local codes across different clients represent heterogeneous data. In turn, a global model takes these local codes as inputs and aggregates them into global category codes, wherein heterogeneous client data is indirectly represented by distinctly encoded global codes, in contrast to the parameter averaging of existing approaches. This enables the learned global model to handle heterogeneous data. In addition, FedART employs a universal learning mechanism to support both federated classification and clustering tasks. Our experiments on various federated classification and clustering tasks show that FedART consistently outperforms state-of-the-art FL methods on data with heterogeneous distributions across clients.
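To make the notion of "category codes" concrete, the following is a minimal sketch of Fuzzy ART category learning, the self-organizing mechanism that the abstract builds on. It is illustrative only: the class name, parameter values, and update rules follow the standard Fuzzy ART formulation (complement coding, choice function, vigilance test, fast learning), not FedART's actual implementation, whose details are in the full paper.

```python
# Illustrative Fuzzy ART sketch (not the authors' code). Each input sample
# either resonates with an existing category code or spawns a new one, so
# heterogeneous data naturally yields distinct codes rather than being
# averaged together.

def complement_code(x):
    """Standard Fuzzy ART preprocessing: concatenate x with 1 - x."""
    return x + [1.0 - v for v in x]

def fuzzy_and(a, b):
    """Element-wise fuzzy AND (minimum)."""
    return [min(u, v) for u, v in zip(a, b)]

def norm1(a):
    """L1 norm of a non-negative vector."""
    return sum(a)

class FuzzyART:
    def __init__(self, rho=0.75, alpha=0.001, beta=1.0):
        self.rho = rho      # vigilance: higher -> finer-grained clusters
        self.alpha = alpha  # choice parameter
        self.beta = beta    # learning rate (1.0 = fast learning)
        self.weights = []   # one weight vector per category code

    def learn(self, x):
        """Assign x (list of values in [0, 1]) to a category code index."""
        x = complement_code(x)
        # Rank existing codes by the choice function T_j = |x ^ w_j| / (alpha + |w_j|).
        ranked = sorted(
            range(len(self.weights)),
            key=lambda j: norm1(fuzzy_and(x, self.weights[j]))
                          / (self.alpha + norm1(self.weights[j])),
            reverse=True)
        for j in ranked:
            m = fuzzy_and(x, self.weights[j])
            # Vigilance (resonance) test: match ratio must exceed rho.
            if norm1(m) / norm1(x) >= self.rho:
                self.weights[j] = [self.beta * u + (1 - self.beta) * w
                                   for u, w in zip(m, self.weights[j])]
                return j
        # No existing code resonates: create a new category code.
        self.weights.append(x)
        return len(self.weights) - 1
```

For example, with `rho=0.8`, two nearby samples such as `[0.1, 0.1]` and `[0.12, 0.09]` resonate with the same code, while a distant sample such as `[0.9, 0.9]` fails the vigilance test and creates a new code. In the federated setting described above, each client would run such a learner locally, and the server would treat the resulting code prototypes as inputs to a global ART model.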