Abstract

Class-incremental learning, in which training data arrives continuously and is used to extend a model's knowledge, has been widely studied and applied in fields such as pattern recognition and computer vision. However, traditional incremental learning methods face the challenge of catastrophic forgetting. To address this, we use a generative adversarial network to assist the model's class-incremental learning. Some existing methods use the old classifier's soft-target outputs on generated pseudo-data to help the classifier retain knowledge of old classes, but training on generated samples with soft targets alongside new-class samples with explicit labels creates an asymmetric-information problem between old and new classes during incremental learning. This paper presents a new knowledge distillation method based on a multi-hinge cGAN: multi-hinge distillation transfers class-correlation information among all classes from the discriminator to the classifier and helps the classifier overcome catastrophic forgetting. Experiments on MNIST and CIFAR-10 show that our method is comparable to many state-of-the-art incremental learning methods.
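
The abstract describes the distillation scheme only at a high level. The PyTorch sketch below illustrates one plausible form of the soft-target distillation it refers to: a temperature-softened KL term on generated pseudo-data combined with ordinary cross-entropy on new-class data. The function names, the distill_weight parameter, and the assumption that the discriminator exposes per-class logits are illustrative assumptions, not the paper's actual implementation.

import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # Soft-target distillation: KL divergence between temperature-softened
    # distributions, which transfers class-correlation information from a
    # teacher (here, assumed to be the cGAN discriminator's class head).
    soft_teacher = F.softmax(teacher_logits / temperature, dim=1)
    log_soft_student = F.log_softmax(student_logits / temperature, dim=1)
    # Scaling by T^2 keeps gradient magnitudes comparable to the hard-label loss.
    return F.kl_div(log_soft_student, soft_teacher, reduction="batchmean") * temperature ** 2

def incremental_step_loss(classifier, discriminator, new_x, new_y, pseudo_x,
                          distill_weight=1.0):
    # One incremental step: cross-entropy on real new-class samples plus
    # distillation on generator pseudo-samples standing in for old classes.
    ce = F.cross_entropy(classifier(new_x), new_y)
    with torch.no_grad():
        teacher_logits = discriminator(pseudo_x)  # assumed to return class logits
    kd = distillation_loss(classifier(pseudo_x), teacher_logits)
    return ce + distill_weight * kd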
