Abstract

Catastrophic forgetting is a key challenge for class-incremental learning with deep neural networks, where performance degrades considerably when dealing with long sequences of new classes. To tackle this issue, in this paper, we propose a new exemplar-supported representation for incremental learning (ESRIL) approach that consists of three components. First, we use memory aware synapses (MAS), pre-trained on ImageNet, to retain the ability of robust representation learning and classification for old classes from the perspective of the model. Second, exemplar-based subspace clustering (ESC) is utilized to construct the exemplar set, which preserves performance across the diverse views of the data. Third, the nearest class multiple centroids (NCMC) classifier is used to save the training cost of the fully connected layer of MAS when a predefined criterion is met. Extensive experiments and analyses are presented to show the influence of various backbone structures and the effectiveness of the different components of our model. Experiments on several general-purpose and fine-grained image recognition datasets fully demonstrate the efficacy of the proposed methodology.
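
The exact ESC procedure of [18] is not reproduced in this summary; the following Python sketch only illustrates the general idea of drawing exemplars from every cluster of a class's deep features, with k-means standing in for subspace clustering. The function name select_exemplars and the per_cluster parameter are illustrative assumptions, not the paper's API.

    # Illustrative exemplar selection in the spirit of ESC: group one class's
    # deep features into k clusters (k-means as a stand-in for subspace
    # clustering) and keep the samples closest to each cluster centre.
    import numpy as np
    from sklearn.cluster import KMeans

    def select_exemplars(features: np.ndarray, k: int, per_cluster: int) -> np.ndarray:
        """Return indices of exemplars drawn from every cluster of one class."""
        km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(features)
        exemplar_idx = []
        for c in range(k):
            members = np.where(km.labels_ == c)[0]
            # rank this cluster's members by distance to its centroid
            dist = np.linalg.norm(features[members] - km.cluster_centers_[c], axis=1)
            exemplar_idx.extend(members[np.argsort(dist)[:per_cluster]].tolist())
        return np.asarray(exemplar_idx)

Selecting a fixed number of exemplars per cluster, rather than per class, is what keeps the stored set diverse: every subspace of the class contributes representatives.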

Highlights

  • In real-world applications, most image recognition systems are incremental [1]: they should be updated continuously to adapt to new data that differ from the existing ones

  • To tackle the aforementioned issues, in this paper, we propose an exemplar-supported representation for incremental learning (ESRIL) approach for both general-purpose and fine-grained image recognition

  • The main contributions of our paper are highlighted as follows: 1) We propose a novel incremental learning approach that incorporates three key components, i.e., i) memory aware synapses (MAS) [8] for representation learning and classification, ii) scalable exemplar-based subspace clustering (ESC) for selecting and ranking exemplars [18] to guarantee sufficient and diverse exemplars from each subspace, and iii) the nearest class multiple centroids (NCMC) classifier for effective classification, to save training time and to reduce the impact of class imbalance between old and new classes (a minimal sketch of the NCMC idea follows this list)
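
As a rough illustration of the NCMC idea, assuming deep features are already extracted, each class can be summarized by several centroids and a query assigned to the class of the nearest one. The clustering used to obtain the centroids is an assumption here (k-means), and the class and method names are illustrative, not the paper's implementation.

    # Minimal NCMC sketch: represent each class by several feature centroids
    # and classify a query by the class of its nearest centroid.
    import numpy as np
    from sklearn.cluster import KMeans

    class NCMCClassifier:
        def __init__(self, n_centroids: int = 5):
            self.n_centroids = n_centroids
            self.centroids = None
            self.centroid_labels = None

        def fit(self, features: np.ndarray, y: np.ndarray) -> "NCMCClassifier":
            all_centroids, all_labels = [], []
            for cls in np.unique(y):
                X = features[y == cls]
                k = min(self.n_centroids, len(X))
                km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(X)
                all_centroids.append(km.cluster_centers_)
                all_labels.extend([cls] * k)
            self.centroids = np.vstack(all_centroids)
            self.centroid_labels = np.asarray(all_labels)
            return self

        def predict(self, features: np.ndarray) -> np.ndarray:
            # distance from every query to every stored centroid
            dist = np.linalg.norm(features[:, None, :] - self.centroids[None, :, :], axis=2)
            return self.centroid_labels[np.argmin(dist, axis=1)]

Because no fully connected layer is retrained when new classes arrive, adding a class only requires clustering its features, which is where the training-time saving comes from.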

Summary

INTRODUCTION

In real-world applications, most image recognition systems are incremental [1]: they should be updated continuously to adapt to new data that differ from the existing ones. Model-based methods [3]–[12] update the weight parameters by using specific learning algorithms or specially defined loss functions. These methods cannot work well on long sequences of new classes or tasks, due mainly to the omission of the old data [16]. The main contributions of our paper are highlighted as follows: 1) We propose a novel incremental learning approach that incorporates three key components, i.e., i) memory aware synapses (MAS) [8] for representation learning and classification, ii) scalable exemplar-based subspace clustering (ESC) for selecting and ranking exemplars [18] to guarantee sufficient and diverse exemplars from each subspace, and iii) the nearest class multiple centroids (NCMC) classifier for effective classification, to save training time and to reduce the impact of class imbalance between old and new classes.
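
For readers unfamiliar with MAS [8]: it estimates a per-parameter importance weight from the gradient of the squared L2 norm of the network output on unlabeled data, then penalizes changes to important parameters when learning new classes. Below is a minimal PyTorch sketch of that idea, assuming a standard model and data loader; the function names and the lam hyper-parameter are illustrative assumptions, not the paper's code.

    # Sketch of the MAS importance estimate and quadratic penalty (PyTorch).
    import torch

    def estimate_importance(model, loader, device="cpu"):
        """Average gradient magnitude of ||f(x)||^2 w.r.t. each parameter."""
        omega = {n: torch.zeros_like(p) for n, p in model.named_parameters()}
        n_batches = 0
        for x, _ in loader:                      # labels are not needed
            model.zero_grad()
            model(x.to(device)).pow(2).sum().backward()
            for n, p in model.named_parameters():
                if p.grad is not None:
                    omega[n] += p.grad.abs()
            n_batches += 1
        return {n: w / max(n_batches, 1) for n, w in omega.items()}

    def mas_penalty(model, omega, theta_star, lam=1.0):
        """Quadratic penalty anchoring important weights at their old values."""
        loss = 0.0
        for n, p in model.named_parameters():
            loss = loss + (omega[n] * (p - theta_star[n]).pow(2)).sum()
        return lam * loss

Here theta_star holds a copy of the parameters saved after the previous task; adding mas_penalty to the new-task loss discourages drift in the weights the old classes rely on.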

RELATED WORK
MAS-BASED REPRESENTATION LEARNING AND CLASSIFICATION
NEAREST CLASS MULTIPLE CENTROIDS CLASSIFIER
Findings
CONCLUSION