Abstract

Few-shot class-incremental learning (FSCIL), in which a model incrementally learns novel classes from only a few samples without forgetting previously learned classes, is crucial for practical artificial intelligence in the real world. However, FSCIL confronts two significant challenges: “forgetting old” (catastrophic forgetting of previously learned classes) and “overfitting new” (overfitting to the few samples of novel classes). We focus on convolutional neural network (CNN)-based FSCIL and propose a human-cognition-inspired method in which the knowledge of novel classes is learned under the guidance of previously acquired knowledge. Specifically, we first learn a discriminative and generalizable CNN feature extractor from the base classes in the initial task. We then generate representations of both base and novel classes in a unified feature space without training on the novel classes, thereby avoiding “forgetting old.” For novel classes arriving in long sequences of tasks, we go beyond representation generation and enhance each novel representation by exploiting its correlations with previously learned classes, which alleviates “overfitting new” and helps the novel classes adapt to the feature space. Experimental results show that our method achieves highly competitive performance on the MiniImageNet and CIFAR-100 datasets.
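The abstract leaves the mechanics unspecified, so the following is only a minimal sketch of one plausible reading: class representations are taken as prototypes (mean features under the frozen base-class extractor), and each novel prototype is refined by a similarity-weighted mixture of previously learned prototypes. The function names (class_prototypes, enhance_novel_prototypes) and the mixing weight alpha are hypothetical illustrations, not the paper's actual procedure.

```python
import torch
import torch.nn.functional as F


def class_prototypes(features: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    """Average the frozen-extractor features of each class into one prototype.

    No gradient updates are involved, so previously learned classes
    are untouched when new prototypes are generated.
    """
    classes = labels.unique(sorted=True)
    return torch.stack([features[labels == c].mean(dim=0) for c in classes])


def enhance_novel_prototypes(novel_protos: torch.Tensor,
                             base_protos: torch.Tensor,
                             alpha: float = 0.5) -> torch.Tensor:
    """Blend each novel prototype with a similarity-weighted mixture of
    previously learned prototypes (a hypothetical reading of
    "exploiting correlations with previously learned classes")."""
    # Cosine similarity between every novel and every previously learned prototype.
    sim = F.cosine_similarity(novel_protos.unsqueeze(1),
                              base_protos.unsqueeze(0), dim=-1)
    weights = F.softmax(sim, dim=1)      # (n_novel, n_base) correlation weights
    calibrated = weights @ base_protos   # correlation-guided enhancement term
    # alpha is an assumed hyperparameter trading off raw vs. enhanced prototypes.
    return alpha * novel_protos + (1.0 - alpha) * calibrated


if __name__ == "__main__":
    d = 64                                   # hypothetical feature dimension
    base = torch.randn(60, d)                # e.g., 60 base-class prototypes
    feats = torch.randn(25, d)               # 5-way 5-shot novel features
    labels = torch.arange(5).repeat_interleave(5)
    novel = class_prototypes(feats, labels)
    print(enhance_novel_prototypes(novel, base).shape)  # torch.Size([5, 64])
```

Under this reading, inference reduces to nearest-prototype classification in the unified feature space, so no training ever revisits old classes and forgetting is avoided by construction.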
