Abstract

Class-incremental learning enables classification models to incrementally learn new target classes and accumulate knowledge, and it has become a major concern of the machine learning and classification community. To overcome the catastrophic forgetting that occurs when a network is trained sequentially on a multi-class data stream, a double consolidation class-incremental learning (DCCIL) method is proposed. During incremental learning, the network parameters are adjusted by combining knowledge distillation and elastic weight consolidation, so that the network maintains its recognition ability on the old classes while learning the new ones. Incremental learning experiments are designed, and the proposed method is compared with popular incremental learning methods such as EWC, LwF, and iCaRL. Experimental results show that DCCIL achieves better incremental accuracy than current popular incremental learning algorithms, effectively improving the expansibility and intelligence of the classification model.
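The abstract describes combining knowledge distillation with an elastic weight consolidation penalty in one training objective. The paper's exact formulation is not given here, so the following is only a minimal sketch of such a "double consolidation" loss: the function name, the loss weights `lam_kd` and `lam_ewc`, and the temperature `T` are all hypothetical, not taken from the paper.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled, numerically stable softmax."""
    z = np.asarray(z, dtype=float) / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def double_consolidation_loss(new_logits, old_logits, labels,
                              params, old_params, fisher,
                              T=2.0, lam_kd=1.0, lam_ewc=100.0):
    """Illustrative combined loss (not the paper's exact objective):
    cross-entropy on the new classes, plus a distillation term that
    matches the frozen old network's softened outputs on the old
    classes, plus an EWC quadratic penalty weighted by the Fisher
    information of the old-task parameters."""
    # Cross-entropy on the current labels (new and old classes).
    p = softmax(new_logits)
    ce = -np.mean(np.log(p[np.arange(len(labels)), labels] + 1e-12))

    # Knowledge distillation on the old-class slice of the logits.
    n_old = old_logits.shape[1]
    q_teacher = softmax(old_logits, T)             # frozen old network
    q_student = softmax(new_logits[:, :n_old], T)  # current network
    kd = -np.mean(np.sum(q_teacher * np.log(q_student + 1e-12),
                         axis=1)) * T * T

    # Elastic weight consolidation: penalize drift of important weights.
    ewc = sum(np.sum(f * (w - w_old) ** 2)
              for f, w, w_old in zip(fisher, params, old_params))

    return ce + lam_kd * kd + lam_ewc * ewc
```

In this sketch the distillation term consolidates the old network's output behavior while the EWC term consolidates its important weights, which is one plausible reading of "double consolidation"; the actual weighting and scheduling would follow the paper.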

Highlights

  • Incremental learning [1], known as lifelong learning [2] or continuous learning, aims to enable learning models to continue learning just like human beings

  • We propose a novel Class-IL method, double consolidation class-incremental learning (DCCIL), that allows convolutional neural networks to be trained on new-class data only while preserving their original recognition capabilities

  • To overcome the catastrophic forgetting that often occurs in neural networks during continuous learning, this paper proposes a new class-incremental learning method named DCCIL


Introduction

Incremental learning [1], also known as lifelong learning [2] or continuous learning, aims to enable learning models to keep learning, just as human beings do. The human brain learns incrementally; for example, a child who gets to know the tiger and the lion at the zoo retains this knowledge and can later learn new species such as the dolphin and the seal. Incremental learning is a dynamic model learning technique that helps models continuously acquire and accumulate knowledge. To do so, it must overcome catastrophic forgetting, namely, learn new knowledge without forgetting the old. According to the degree of difficulty, incremental learning can be divided into three categories [4]: task-incremental, domain-incremental, and class-incremental learning, of which class-incremental learning is the most challenging.
