Abstract

Class-incremental learning (CIL) has been widely studied under the setting of starting from a small number of classes (base classes). Instead, we explore an understudied real-world setting of CIL that starts with a strong model pre-trained on a large number of base classes. We hypothesize that a strong base model can provide a good representation for novel classes, and that incremental learning can be done with small adaptations. We propose a two-stage training scheme: i) feature augmentation, cloning part of the backbone and fine-tuning it on the novel data, and ii) fusion, combining the base and novel classifiers into a unified classifier. Experiments show that the proposed method significantly outperforms state-of-the-art CIL methods on the large-scale ImageNet dataset (e.g., +10% overall accuracy over the best baseline). We also propose and analyze understudied practical CIL scenarios, such as base-novel overlap with distribution shift. Our proposed method is robust and generalizes to all analyzed CIL settings.
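The abstract does not specify architectural details, but the two-stage scheme can be illustrated with a minimal sketch. The sketch below assumes a PyTorch setting, a backbone split into a shared trunk and a final block, and feature concatenation as the fusion mechanism; all module names, the split point, and the fusion choice are illustrative assumptions, not the paper's confirmed design.

```python
import copy
import torch
import torch.nn as nn

class TwoStageCIL(nn.Module):
    """Sketch of the two-stage scheme described in the abstract.

    Stage 1 (feature augmentation): clone the last block of the frozen
    pre-trained backbone and fine-tune only the clone on novel data.
    Stage 2 (fusion): train a unified classifier over all classes on
    the concatenated base and novel features.
    """

    def __init__(self, trunk, base_block, feat_dim, n_base, n_novel):
        super().__init__()
        self.trunk = trunk                             # shared early layers, kept frozen
        self.base_block = base_block                   # pre-trained last block, kept frozen
        self.novel_block = copy.deepcopy(base_block)   # trainable clone for novel data
        for p in self.trunk.parameters():
            p.requires_grad = False
        for p in self.base_block.parameters():
            p.requires_grad = False
        # Unified classifier over concatenated base + novel features.
        self.classifier = nn.Linear(2 * feat_dim, n_base + n_novel)

    def forward(self, x):
        h = self.trunk(x)
        f_base = self.base_block(h)     # strong representation from the base model
        f_novel = self.novel_block(h)   # adapted representation for novel classes
        return self.classifier(torch.cat([f_base, f_novel], dim=1))
```

Under these assumptions, stage 1 would fine-tune `novel_block` (with a temporary novel-class head) on the novel data alone, and stage 2 would train `classifier` jointly over base and novel classes, leaving the pre-trained weights untouched.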
