Abstract

Federated learning (FL) is a promising, privacy-preserving distributed learning method that is widely deployed on edge devices. In practical applications, however, the data collected by edge devices varies over time, so FL models must adapt to new data while retaining knowledge learned from old data; failing to do so results in catastrophic forgetting. Continual learning methods can address this problem, but deploying them in FL on edge devices is difficult because of the devices' limited resources and heterogeneous data, which reduces the efficiency and effectiveness of federated continual learning (FCL). This article therefore proposes a resource-efficient heterogeneous FCL framework. The framework divides the global model into an adaptation part for new knowledge and a preservation part for old knowledge. The preservation part counters catastrophic forgetting, and only the adaptation part is trained when learning a new task, reducing resource consumption. In addition, the framework mitigates the impact of heterogeneous data through an aggregation method based on feature representations. Experimental results show that our method mitigates catastrophic forgetting in a resource-efficient manner.
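The abstract does not specify the framework's architecture or training procedure, so the following is only a minimal PyTorch sketch of the two ideas it describes: a model split into a frozen preservation part and a trainable adaptation part, and server-side aggregation weighted by feature representations. All names and design choices here (SplitFCLModel, train_new_task, aggregate_by_features, the cosine-similarity softmax weighting) are illustrative assumptions, not the paper's actual method.

```python
import torch
import torch.nn as nn

class SplitFCLModel(nn.Module):
    """Hypothetical client model split into a preservation part
    (holds old-task knowledge, frozen on new tasks) and an
    adaptation part (trained on each new task)."""
    def __init__(self, in_dim=32, hidden=64, n_classes=10):
        super().__init__()
        self.preservation = nn.Sequential(nn.Linear(in_dim, hidden),
                                          nn.ReLU())
        self.adaptation = nn.Linear(hidden, n_classes)

    def forward(self, x):
        return self.adaptation(self.preservation(x))

def train_new_task(model, loader, epochs=1, lr=1e-2):
    """Train only the adaptation part on a new task; freezing the
    preservation part retains old knowledge and saves resources."""
    for p in model.preservation.parameters():
        p.requires_grad = False
    opt = torch.optim.SGD(model.adaptation.parameters(), lr=lr)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()

def aggregate_by_features(client_states, client_feats):
    """Hypothetical feature-representation-based aggregation:
    each client reports its mean feature vector, and clients whose
    features lie closer to the global mean receive larger weights
    (one plausible reading of the abstract's aggregation method)."""
    feats = torch.stack(client_feats)          # (K, D) per-client means
    global_feat = feats.mean(dim=0)            # (D,)
    sims = torch.cosine_similarity(feats, global_feat.unsqueeze(0), dim=1)
    weights = torch.softmax(sims, dim=0)       # (K,) aggregation weights
    return {k: sum(w * s[k] for w, s in zip(weights, client_states))
            for k in client_states[0]}
```

Under these assumptions, freezing the preservation parameters serves both stated goals at once: old-task knowledge cannot be overwritten, and per-round compute and communication shrink because only adaptation gradients are computed and exchanged.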
