Federated learning is a distributed learning approach aimed at preserving users' data privacy, while incremental learning is an adaptive machine learning method that enables continuous learning from new data. Combining the two into Federated Incremental Learning (FIL) brings together their respective advantages. However, existing federated incremental learning methods still face two main challenges: (1) catastrophic forgetting of previous knowledge by local models when adapting to incremental tasks, and (2) the limited capability of the server to capture critical features during federated aggregation. To address these challenges, this paper proposes a federated incremental learning algorithm based on Topological Data Analysis (TDA). First, the algorithm extracts topological features from the input information and designs a Topological Stability Loss (TSL) to mitigate catastrophic forgetting of previous knowledge by local models. Second, a feature attention mechanism selects appropriate attention weights for each local model, enhancing the recognition performance of the global model. Experimental results on the publicly available CIFAR10, CIFAR100, and ImageNet datasets show that the proposed federated incremental learning model achieves global accuracies of 67.23%, 65.75%, and 62.41%, respectively, surpassing the existing iCaRL incremental algorithm by 3.21%, 2.87%, and 1.34%. These results indicate that the proposed algorithm outperforms existing methods.
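The abstract does not specify the exact form of the Topological Stability Loss or the attention-based aggregation, so the following is only a minimal illustrative sketch of the two ideas under assumed formulations: a TSL modeled as a penalty on drift of a topological feature summary between tasks, and server-side aggregation weights obtained from softmax-normalized cosine similarity between each client's feature summary and a global summary. All function names, shapes, and formulas below are hypothetical, not the paper's actual method.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a 1-D score vector."""
    e = np.exp(x - np.max(x))
    return e / e.sum()

def topological_stability_loss(topo_current, topo_previous):
    """Hypothetical TSL: penalize drift of the topological feature
    summary between the previous task and the current task."""
    return float(np.mean((topo_current - topo_previous) ** 2))

def attention_weighted_aggregate(client_params, client_features, global_features):
    """Hypothetical attention aggregation: weight each client's parameters
    by the softmax-normalized cosine similarity between its feature
    summary and the server's global feature summary."""
    scores = np.array([
        np.dot(f, global_features) /
        (np.linalg.norm(f) * np.linalg.norm(global_features) + 1e-8)
        for f in client_features
    ])
    weights = softmax(scores)                       # attention weights over clients
    return weights @ np.stack(client_params), weights  # weighted average of parameters

# Toy usage: 3 clients, 4-dimensional parameter vectors, 8-dimensional feature summaries
rng = np.random.default_rng(0)
params = [rng.normal(size=4) for _ in range(3)]
feats = [rng.normal(size=8) for _ in range(3)]
global_feats = rng.normal(size=8)

print("TSL example:", topological_stability_loss(feats[0], feats[1]))
agg, w = attention_weighted_aggregate(params, feats, global_feats)
print("attention weights:", w)
print("aggregated parameters:", agg)
```

In practice the client would add a weighted TSL term to its task loss during local training, while the server would apply the attention weights when merging client updates; both choices here are assumptions made for illustration.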