Abstract

Federated incremental learning is well suited to the evolving data requirements of common Federated Learning (FL) tasks. In this setting, clients with large samples dominate the final training result, and each client's unbalanced features are difficult to capture. This paper designs a federated incremental learning framework. First, part of the data is preprocessed to obtain an initial global model. Second, to let the global model learn the importance of features across each client's full sample, and to strengthen its ability to capture critical feature information, a channel-attention neural network model is designed on the client side and a federated aggregation algorithm based on the feature attention mechanism is designed on the server side. Experiments on the standard datasets CIFAR10 and CIFAR100 show that the proposed algorithm achieves good accuracy while realizing incremental learning.
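The abstract does not give the concrete form of the channel-attention model or the attention-based aggregation rule, so the following is only a minimal sketch of the general idea: a squeeze-and-excitation-style channel-attention scorer on the client side, and server-side averaging that weights each client's parameters by its mean attention score. All function names, shapes, and the fixed random excitation weights are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

def channel_attention(feature_maps, reduction=2):
    """Hypothetical squeeze-and-excitation-style channel attention.
    feature_maps: array of shape (channels, H, W); returns per-channel
    scores in (0, 1). Excitation weights are fixed random values here,
    standing in for learned parameters."""
    c = feature_maps.shape[0]
    # Squeeze: global average pooling per channel.
    squeezed = feature_maps.mean(axis=(1, 2))            # shape (c,)
    # Excitation: small two-layer bottleneck (ReLU then sigmoid).
    rng = np.random.default_rng(0)
    w1 = rng.standard_normal((c // reduction, c)) * 0.1
    w2 = rng.standard_normal((c, c // reduction)) * 0.1
    hidden = np.maximum(w1 @ squeezed, 0.0)              # ReLU
    scores = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))        # sigmoid
    return scores

def attention_weighted_aggregate(client_params, client_attention):
    """Hypothetical server-side rule: weight each client's parameter
    vector by its mean channel-attention score, normalized so the
    weights sum to 1, then average."""
    weights = np.array([a.mean() for a in client_attention])
    weights = weights / weights.sum()
    return sum(w * p for w, p in zip(weights, client_params))
```

In this sketch, a client with stronger attention responses (i.e., whose features the model judges more important) contributes proportionally more to the aggregated global model, which is one plausible way to counteract large-sample clients dominating plain FedAvg.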

