Abstract

Few-shot class-incremental learning (FSCIL) is a challenging problem in machine learning. It requires models to gradually learn new knowledge from only a few samples while retaining the knowledge of old classes. However, the limited data available for new classes not only leads to significant overfitting but also exacerbates catastrophic forgetting during incremental learning. To address these two issues, we propose a novel framework named Grassmann Manifold and Information Entropy for Few-Shot Class Incremental Learning (GMIE-FSCIL). Unlike existing methods that model parameters in Euclidean space, our method optimizes the incremental learning network on the Grassmann manifold. More specifically, we embed the acquired knowledge of each class on the Grassmann manifold and preserve its inherent geometric properties through a Grassmann Metric Learning (GML) module. To account for the interconnected relationships among classes, a Graph Information Preserving (GIP) module uses information entropy to construct a neighborhood graph on the Grassmann manifold that maintains inter-class structural information, thereby mitigating catastrophic forgetting of learned knowledge. On the CIFAR100, miniImageNet, and CUB200 datasets, our method improves average accuracy (Avg) over mainstream methods by at least 2.72%, 1.21%, and 1.27%, respectively.
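The abstract does not give implementation details, but as a rough illustration of the geometry the GML module operates in, the sketch below computes a standard projection-metric distance between two class subspaces on the Grassmann manifold via principal angles. The function name, the choice of the projection metric, and the toy subspaces are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def grassmann_distance(U1, U2):
    """Projection-metric distance between two points on the Grassmann
    manifold Gr(k, n), i.e. two k-dimensional subspaces of R^n.

    U1, U2: (n, k) matrices with orthonormal columns, each spanning
    one class subspace. (Hypothetical helper; not the paper's API.)
    """
    # Principal angles: the singular values of U1^T U2 equal cos(theta_i).
    s = np.linalg.svd(U1.T @ U2, compute_uv=False)
    s = np.clip(s, -1.0, 1.0)      # guard against floating-point round-off
    theta = np.arccos(s)           # principal angles in [0, pi/2]
    # Projection metric: sqrt(sum_i sin^2(theta_i)).
    return np.sqrt(np.sum(np.sin(theta) ** 2))

# Example: two random 8-dimensional class subspaces of a 64-d feature space,
# obtained by orthonormalizing random bases with QR.
rng = np.random.default_rng(0)
U1, _ = np.linalg.qr(rng.standard_normal((64, 8)))
U2, _ = np.linalg.qr(rng.standard_normal((64, 8)))
print(grassmann_distance(U1, U2))
```

A distance of this kind could serve as the building block for both modules described above: metric learning over class subspaces (GML) and pairwise distances for a neighborhood graph (GIP).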
