Abstract

With the rapid growth of the Internet, new data can easily be obtained for many application domains. However, when an artificial neural network (ANN) is trained on new data, it can completely forget what it learned before, a phenomenon known as catastrophic forgetting. The main cause of this problem is the inability of ANNs to balance new classes against old ones. To learn new knowledge without suffering from catastrophic forgetting, several incremental learning algorithms have been proposed. This paper proposes balancing the features of new and old classes through angular distillation, and a small set of exemplars from the old classes is retained to improve performance on the old data. The effectiveness of our algorithm is demonstrated on the CIFAR-100 dataset.
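To illustrate the idea behind angular distillation, the sketch below shows one common way such a loss can be written: the angle (via cosine similarity) between the feature vector produced by the current model and the one produced by the frozen old model is penalized, so the feature directions learned on old classes are preserved while new classes are fitted. This is a minimal sketch assuming a PyTorch setup; the function name, the exact normalization, and any margin or scaling terms are assumptions, and the loss used in the paper may differ.

import torch
import torch.nn.functional as F

def angular_distillation_loss(new_feats: torch.Tensor, old_feats: torch.Tensor) -> torch.Tensor:
    # Hypothetical sketch of an angular (cosine-based) distillation term.
    # new_feats: features from the model being trained on new classes.
    # old_feats: features from the frozen copy of the old model (no gradient).
    new_feats = F.normalize(new_feats, dim=1)   # project onto the unit sphere
    old_feats = F.normalize(old_feats, dim=1)
    cos_sim = (new_feats * old_feats).sum(dim=1)  # cosine of the angle per sample
    return (1.0 - cos_sim).mean()                 # 0 when directions match exactly

In training, this term would typically be added to the classification loss computed on the union of the new data and the retained exemplars, with a weighting coefficient controlling how strongly old feature directions are preserved.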
