Abstract

Class incremental learning requires models to learn new-class knowledge without forgetting old-class information. As a natural solution, the parallel one-class framework (POC) has attracted extensive attention. However, POC is prone to a lack of comparability between classifiers because their output distributions are inconsistent. To address this drawback, we propose an incremental learning method based on Identically Distributed Parallel One-class Classifiers (IDPOC). The core of IDPOC is a novel one-class classifier with Gaussian-distributed output, referred to as Deep-SVD2D. Deep-SVD2D encourages the distribution of sample representations to follow a standard multivariate Gaussian. Consequently, the distance between a representation and its class center approximately follows a chi-square distribution with some number of degrees of freedom. IDPOC further eliminates the dependence on the degrees of freedom so that the outputs of all classifiers follow an identical distribution, thus enhancing the comparability between different classifiers. We evaluate IDPOC on four popular benchmarks: MNIST, CIFAR10, CIFAR100, and Tiny-ImageNet. The experimental results show that IDPOC achieves state-of-the-art performance; e.g., it outperforms the best baseline by 1.6% and 2.8% on the two large-scale benchmarks CIFAR100 and Tiny-ImageNet, respectively. The source code is publicly available at https://github.com/SunWenJu123/IDPOC.
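The key idea sketched below is not the paper's exact procedure, only an illustration of the principle the abstract describes: if representations of a class are approximately standard Gaussian around their center, the squared distance to the center follows a chi-square distribution whose degrees of freedom equal the feature dimension. Passing that distance through the matching chi-square CDF removes the dependence on the dimension, so every classifier's score lives on the same scale. All function names here (`chi2_cdf_even`, `normalized_score`) are hypothetical, and the closed-form CDF shown assumes an even feature dimension for simplicity.

```python
import math
import random

def chi2_cdf_even(x, d):
    """CDF of a chi-square distribution with an EVEN number of degrees
    of freedom d, using the closed form of the incomplete gamma function."""
    m = d // 2
    tail = sum((x / 2.0) ** i / math.factorial(i) for i in range(m))
    return 1.0 - math.exp(-x / 2.0) * tail

def normalized_score(feature, center):
    """Map the squared distance to the class center through the chi-square
    CDF, yielding a score whose distribution no longer depends on the
    feature dimension (illustrative stand-in for IDPOC's normalization)."""
    d = len(feature)
    sq_dist = sum((f - c) ** 2 for f, c in zip(feature, center))
    return chi2_cdf_even(sq_dist, d)

# Sanity check: if features are ~ N(0, I_d) around the center, the scores
# are approximately Uniform(0, 1) regardless of d, so classifiers with
# different feature dimensions produce directly comparable outputs.
random.seed(0)
d = 8
scores = [normalized_score([random.gauss(0, 1) for _ in range(d)], [0.0] * d)
          for _ in range(5000)]
print(sum(scores) / len(scores))  # close to 0.5 for any dimension d
```

Because each per-class score is identically distributed under its own class model, the scores can be compared directly to pick the winning classifier, which is the comparability property the abstract attributes to IDPOC.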
