Abstract

Few-shot class-incremental learning (FSCIL) is an extremely challenging but valuable problem in real-world applications. When faced with a novel few-shot task at each incremental stage, a model must contend with both catastrophic forgetting of old knowledge and overfitting to new categories with limited training data. In this paper, we propose an efficient prototype replay and calibration (EPRC) method with three stages to improve classification performance. We first perform effective pre-training with rotation and mix-up augmentations to obtain a strong backbone. Then a series of pseudo few-shot tasks are sampled for meta-training, which enhances the generalization ability of both the feature extractor and the projection layer and thereby helps mitigate the overfitting problem of few-shot learning. Furthermore, an even nonlinear transformation function is incorporated into the similarity computation to implicitly calibrate the generated prototypes of different categories and alleviate correlations among them. Finally, in the incremental-training stage we replay the stored prototypes to relieve catastrophic forgetting and rectify the prototypes to be more discriminative via an explicit regularization within the loss function. Experimental results on CIFAR-100 and miniImageNet demonstrate that EPRC significantly boosts classification performance compared with existing mainstream FSCIL methods.
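To make the pipeline above concrete, the sketch below illustrates in PyTorch how prototype-based classification with a calibrated similarity and prototype replay could look. It is a minimal sketch under stated assumptions, not the paper's exact implementation: the placeholder even function g(s) = s², the way it enters the similarity, the temperature `tau`, the mixing weight `alpha`, the regularization weight `lam`, and all function names are hypothetical.

```python
# Minimal sketch of prototype replay with a calibrated similarity.
# All specific choices below are illustrative assumptions, not EPRC's exact code.
import torch
import torch.nn.functional as F


def class_prototypes(features, labels, num_classes):
    """Per-class mean embedding, L2-normalized; these are the stored prototypes."""
    protos = torch.stack([features[labels == c].mean(dim=0)
                          for c in range(num_classes)])
    return F.normalize(protos, dim=1)


def calibrated_logits(features, prototypes, tau=16.0, alpha=0.5):
    """Cosine similarity to each prototype, with an even nonlinear correction
    term g(s) = s**2 added before temperature scaling. The concrete even
    function, and how it is combined with the raw similarity, are assumptions."""
    s = F.normalize(features, dim=1) @ prototypes.t()  # cosine similarities
    return tau * (s + alpha * s.pow(2))                # s**2 is even in s


def incremental_loss(new_feats, new_labels, all_prototypes, lam=0.1):
    """Cross-entropy over old + new classes, replaying the stored old-class
    prototypes as fixed classifier weights, plus an explicit regularizer that
    penalizes overlap between prototypes to keep them discriminative.
    `new_labels` are assumed to be global class indices into all_prototypes."""
    ce = F.cross_entropy(calibrated_logits(new_feats, all_prototypes), new_labels)
    sim = all_prototypes @ all_prototypes.t()
    overlap = sim - torch.eye(sim.size(0), device=sim.device)  # drop self-similarity
    reg = overlap.clamp(min=0).pow(2).mean()
    return ce + lam * reg
```

In such a setup, prototypes computed in earlier sessions would be kept in memory and concatenated with the newly computed prototypes of the current session before calling `incremental_loss`, so that replay and the discriminativeness regularizer act on old and new classes jointly.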
