Abstract

Few-Shot Class-Incremental Learning (FSCIL) extends the Class-Incremental Learning (CIL) problem: a model must cope with data scarcity while still addressing the Catastrophic Forgetting (CF) problem. The problem remains open because recent works are built upon Convolutional Neural Networks (CNNs), which perform sub-optimally compared to transformer approaches. Our paper presents the Robust Transformer Approach (ROBUSTA), built upon the Compact Convolutional Transformer (CCT). The issue of overfitting due to few samples is overcome with the notion of a stochastic classifier, where the classifier's weights are sampled from a distribution with mean and variance vectors, thus increasing the likelihood of correct classifications, and with a batch-norm layer that stabilizes the training process. The issue of CF is dealt with by the idea of delta parameters, small task-specific trainable parameters learned while keeping the backbone network frozen. A non-parametric approach is developed to infer the delta parameters for the model's predictions. A prototype rectification approach is applied to avoid biased prototype calculations caused by data scarcity. The advantage of ROBUSTA is demonstrated through a series of experiments on benchmark problems, where it outperforms prior arts by large margins without any data augmentation protocols.
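To make the stochastic classifier idea concrete, the sketch below shows one plausible realization in PyTorch: classifier weights are drawn from a learned Gaussian (mean and variance vectors) via the reparameterization trick, with mean weights used deterministically at inference. This is a minimal illustration assuming a cosine-similarity classifier head; the class name, initialization, and exact sampling scheme are assumptions, not the paper's verified implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class StochasticClassifier(nn.Module):
    """Hypothetical stochastic classifier: W ~ N(mu, sigma^2) per weight entry."""

    def __init__(self, feat_dim: int, num_classes: int):
        super().__init__()
        # Learned mean and log-variance vectors for the classifier weights.
        self.mu = nn.Parameter(torch.randn(num_classes, feat_dim) * 0.01)
        self.log_sigma = nn.Parameter(torch.zeros(num_classes, feat_dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        if self.training:
            # Reparameterization trick: sampling stays differentiable,
            # so gradients reach both mu and log_sigma.
            eps = torch.randn_like(self.mu)
            weight = self.mu + self.log_sigma.exp() * eps
        else:
            # Deterministic mean weights at inference time.
            weight = self.mu
        # Cosine-style logits on normalized features and weights.
        return F.linear(F.normalize(x, dim=-1), F.normalize(weight, dim=-1))

# Example usage (dimensions are illustrative):
# clf = StochasticClassifier(feat_dim=512, num_classes=100)
# logits = clf(torch.randn(8, 512))
```

Sampling the weights acts as a regularizer under data scarcity: each forward pass sees a slightly perturbed classifier, which discourages overfitting to the few available shots.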
