Abstract

With the advancement and widespread adoption of deep learning models, class incremental learning has attracted growing interest. It aims to continuously learn new classes in an open and dynamic environment while retaining the ability to recognize previously learned ones. The central challenge is to keep learning new classes while mitigating catastrophic forgetting, thereby striking a better balance between stability and adaptability. To address this challenge, we propose a class incremental learning method that leverages dynamic representations: previously acquired features are preserved while new ones are adapted, which effectively reduces catastrophic forgetting. Furthermore, we introduce a feature augmentation mechanism that significantly enhances the model's classification performance when new classes are incorporated, enabling efficient learning of both old and new classes without compromising previously trained components. Extensive experiments on two class-incremental learning benchmarks consistently demonstrate significant performance advantages over other methods.
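To make the idea of dynamic representations with feature augmentation more concrete, the PyTorch sketch below shows one common way such a design can be organized: previously trained feature extractors are frozen to preserve old features, a new extractor is added at each incremental step, and the concatenated features are lightly perturbed during training as a simple form of feature augmentation. This is a minimal illustrative sketch, not the paper's actual architecture; the class name, the `backbone_fn` factory, and the Gaussian perturbation are assumptions introduced here for illustration.

```python
import torch
import torch.nn as nn


class DynamicRepresentationNet(nn.Module):
    """Illustrative sketch: one frozen extractor per past step, a new
    trainable extractor for the current step, and a unified classifier
    over the concatenated features."""

    def __init__(self, backbone_fn, feat_dim, num_classes):
        super().__init__()
        self.backbone_fn = backbone_fn  # factory returning a fresh feature extractor
        self.feat_dim = feat_dim
        self.extractors = nn.ModuleList([backbone_fn()])
        self.classifier = nn.Linear(feat_dim, num_classes)

    def expand(self, num_new_classes):
        # Freeze previously learned extractors to preserve old features.
        for p in self.extractors.parameters():
            p.requires_grad = False
        # Add a new extractor dedicated to the incoming classes.
        self.extractors.append(self.backbone_fn())
        # Rebuild the classifier over the enlarged feature space,
        # copying the old weights into the corresponding slice.
        old = self.classifier
        total_dim = self.feat_dim * len(self.extractors)
        total_classes = old.out_features + num_new_classes
        self.classifier = nn.Linear(total_dim, total_classes)
        with torch.no_grad():
            self.classifier.weight[: old.out_features, : old.in_features] = old.weight
            self.classifier.bias[: old.out_features] = old.bias

    def forward(self, x, noise_std=0.0):
        feats = torch.cat([f(x) for f in self.extractors], dim=1)
        # Simple feature-level augmentation (Gaussian perturbation) during
        # training; the paper's actual augmentation mechanism may differ.
        if self.training and noise_std > 0:
            feats = feats + noise_std * torch.randn_like(feats)
        return self.classifier(feats)
```

A typical incremental step under these assumptions would call `expand(num_new_classes)` before training on the new data, then optimize only the newly added extractor and the rebuilt classifier.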
