Abstract

Class incremental learning (CIL) learns new classes continuously by updating an existing model rather than retraining from scratch on all seen classes, as traditional offline learning does. This makes CIL better suited to classification in dynamic environments, where new classes arrive progressively. However, CIL's update mode causes key knowledge of old classes to be lost, leading to the catastrophic forgetting (CF) problem. In this paper, a novel CIL method with a Kullback-Leibler constraint and multi-strategy exemplar selection (CIL-KLMES) is proposed for classification, built on the max-margin factor analysis (MMFA) model. To handle the CF problem, CIL-KLMES imposes a Kullback-Leibler (KL) divergence term on the important parameters during model updates, constraining their distributions to remain similar; this prevents the updated model from deviating too far from the previous model and preserves the knowledge of old classes. Moreover, CIL-KLMES selects a few representative exemplars from the old classes based on a robust description of the data distribution and the classification decision boundary. By replaying these representative exemplars together with new-class data when updating the model, the key knowledge of old classes is further preserved, so the CF problem is alleviated substantially. Experimental results demonstrate the effectiveness of CIL-KLMES.
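To illustrate the kind of constraint the abstract describes, the following is a minimal sketch of a KL-regularized incremental update objective. It assumes diagonal Gaussian distributions over the important parameters (the abstract does not specify the parametric form, and the function names here are hypothetical, not from the paper): the task loss on new-class data plus replayed exemplars is penalized by the KL divergence between the updated and previous parameter distributions.

```python
import numpy as np

def kl_gaussian(mu_new, var_new, mu_old, var_old):
    """KL( N(mu_new, var_new) || N(mu_old, var_old) ) for diagonal
    Gaussians. Small values mean the updated parameter distribution
    stays close to the previous model's distribution."""
    return 0.5 * np.sum(
        np.log(var_old / var_new)
        + (var_new + (mu_new - mu_old) ** 2) / var_old
        - 1.0
    )

def incremental_objective(task_loss, mu_new, var_new, mu_old, var_old, lam=1.0):
    """Hypothetical update objective: task loss computed on new-class
    data together with replayed exemplars, plus a KL penalty (weighted
    by lam) that discourages the updated model from deviating too far
    from the previous one, preserving old-class knowledge."""
    return task_loss + lam * kl_gaussian(mu_new, var_new, mu_old, var_old)
```

In this reading, the weight `lam` trades plasticity for stability: a larger value keeps the important parameters closer to their previous distribution at the cost of slower adaptation to new classes.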
