Abstract

Federated Continual Learning (FCL) suffers from two major problems: probability bias and imbalance in parameter variations. Both problems lead to catastrophic forgetting during the FCL process. This paper therefore proposes a novel FCL framework, Federated Probability Memory Recall (FedPMR), to mitigate probability bias and the imbalance in parameter variations. First, to address probability bias, this paper designs the Probability Distribution Alignment (PDA) module, which consolidates the memory of old probability experience. Specifically, PDA maintains a replay buffer and uses the probability memory stored in the buffer to correct the offset probabilities of previous tasks during two-stage training. Second, to alleviate the imbalance in parameter variations, this paper designs the Parameter Consistency Constraint (PCC) module, which constrains the magnitude of neural weight changes for previous tasks. Concretely, PCC applies a set of adaptive weights to subsets of the regularization term that constrains parameter changes, forcing the current model to remain sufficiently close to the past model in parameter-space distance. Experiments with various levels of task similarity across clients demonstrate that our method establishes new state-of-the-art performance compared to previous FCL approaches.
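The following is a minimal sketch of the PDA idea as described above: a replay buffer stores softmax outputs ("probability memory") for exemplars of previous tasks, and an alignment term pulls the current model's predictions on those exemplars back toward the stored memory. The buffer API, the use of a KL-divergence loss, and the averaging are assumptions for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

class ProbabilityReplayBuffer:
    """Hypothetical buffer of old-task inputs and their recorded probabilities."""

    def __init__(self):
        self.exemplars = []  # inputs saved from previous tasks
        self.memories = []   # softmax outputs recorded when those tasks were learned

    @torch.no_grad()
    def store(self, model, x):
        # Record the model's probability distribution on x at the end of a task.
        self.exemplars.append(x)
        self.memories.append(F.softmax(model(x), dim=1))

    def alignment_loss(self, model):
        # Penalize drift of current predictions away from the stored probabilities
        # (assumed KL-divergence form; the paper may use a different divergence).
        loss = torch.zeros(())
        for x, p_old in zip(self.exemplars, self.memories):
            log_p_new = F.log_softmax(model(x), dim=1)
            loss = loss + F.kl_div(log_p_new, p_old, reduction="batchmean")
        return loss / max(len(self.exemplars), 1)
```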
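Similarly, the PCC module can be read as an adaptively weighted penalty that keeps the current parameters close to a snapshot taken after the previous task, with a separate weight per parameter subset. A sketch is given below; how the adaptive weights are computed is not specified in the abstract, so the `group_weights` values here are placeholders.

```python
import torch

def pcc_penalty(model, old_params, group_weights):
    """Weighted squared distance between current and previous parameters.

    old_params:    dict mapping parameter name -> detached snapshot tensor.
    group_weights: dict mapping parameter name -> scalar adaptive weight
                   (placeholder; the paper derives its own weighting scheme).
    """
    penalty = torch.zeros((), device=next(model.parameters()).device)
    for name, p in model.named_parameters():
        w = group_weights.get(name, 1.0)  # assumed default weight
        penalty = penalty + w * (p - old_params[name]).pow(2).sum()
    return penalty

# Assumed usage: total_loss = task_loss + lam * pcc_penalty(model, snapshot, weights)
```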
