Abstract

Given a model well-trained on a large-scale base dataset, few-shot class-incremental learning (FSCIL) aims to incrementally learn novel classes from a few labeled samples while avoiding overfitting and without catastrophically forgetting previously encountered classes. Semi-supervised learning, which harnesses freely available unlabeled data to compensate for limited labeled data, boosts performance in numerous vision tasks and can heuristically be applied to FSCIL, yielding semi-supervised FSCIL (Semi-FSCIL). So far, very little work has focused on the Semi-FSCIL task, leaving unresolved the question of how well semi-supervised learning adapts to FSCIL. In this article, we focus on this adaptability issue and present a simple yet efficient Semi-FSCIL framework named uncertainty-aware distillation with class-equilibrium (UaD-ClE), comprising two modules: uncertainty-aware distillation (UaD) and class equilibrium (ClE). Specifically, when incorporating unlabeled data into each incremental session, we introduce the ClE module, which employs class-balanced self-training (CB_ST) to prevent easy-to-classify classes from gradually dominating pseudo-label generation. To distill reliable knowledge from the reference model, we further implement the UaD module, which combines uncertainty-guided knowledge refinement with adaptive distillation. Comprehensive experiments on three benchmark datasets demonstrate that our method improves the adaptability of unlabeled data to FSCIL tasks via semi-supervised learning. The code is available at https://github.com/yawencui/UaD-ClE.
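To illustrate the class-balancing idea behind the ClE module, the sketch below shows one common way to implement class-balanced pseudo-label selection: cap the number of pseudo-labeled samples kept per class, retaining only the most confident predictions within each class. This is a minimal illustrative sketch, not the authors' CB_ST implementation; the function name, the fixed `per_class_quota` parameter, and the confidence-based selection rule are assumptions for exposition.

```python
import numpy as np

def class_balanced_pseudo_labels(probs, per_class_quota):
    """Select pseudo-labeled samples with a per-class cap so that
    easy-to-classify classes do not dominate pseudo-label generation.

    probs: (N, C) array of predicted class probabilities for N
           unlabeled samples over C classes.
    per_class_quota: maximum number of samples to keep per class.
    Returns (indices, labels) of the selected samples.
    """
    preds = probs.argmax(axis=1)   # hard pseudo-label per sample
    conf = probs.max(axis=1)       # confidence of that prediction
    selected_idx, selected_lab = [], []
    for c in range(probs.shape[1]):
        cand = np.where(preds == c)[0]  # samples predicted as class c
        # keep only the `per_class_quota` most confident candidates
        keep = cand[np.argsort(-conf[cand])[:per_class_quota]]
        selected_idx.extend(keep.tolist())
        selected_lab.extend([c] * len(keep))
    return np.array(selected_idx), np.array(selected_lab)
```

Without the per-class cap, a naive global confidence threshold would let a well-learned base class absorb most of the pseudo-label budget, starving the novel few-shot classes.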
