Abstract

When training data is provided in a streamed or staged manner, traditional deep learning methods suffer severe performance degradation on previous tasks or classes, because parameters that were well learned on old data are overwritten during updates. Incremental learning aims to avoid this catastrophic forgetting problem, from which traditional deep learning frameworks suffer, and to enable the model to learn in a continuous way. However, how to strike a good trade-off between plasticity and stability, so that the model performs well on both new and old classes, remains an open problem in incremental learning. In this paper, we propose a new incremental learning method called the Local Structure Constrained Network (LSC-Net), which employs local relation constraints rather than strong global similarity constraints when updating the model. With carefully designed inter-class and intra-class restrictions, LSC-Net makes samples of the same class more compact in the feature space while enlarging the distance between samples of different classes. In addition, LSC-Net adopts the HM-SoftMax loss instead of the cross-entropy loss in the classifier, which alleviates the effect of data imbalance. Experimental results on CIFAR-100 and ImageNet-100 show that LSC-Net effectively overcomes catastrophic forgetting and obtains competitive results that surpass state-of-the-art methods.
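
As a rough illustration of the inter-class and intra-class constraints described above, the sketch below shows a generic pairwise relation loss that pulls same-class embeddings together and pushes different-class embeddings apart. The function name, margin, and exact formulation are assumptions made for illustration; they are not the actual LSC-Net loss from the paper.

```python
# Minimal sketch of an intra-/inter-class relation loss (illustrative only,
# not the paper's LSC-Net formulation).
import torch
import torch.nn.functional as F

def local_structure_loss(features, labels, margin=1.0):
    """Pull same-class features together, push different-class features apart.

    features: (N, D) batch of embeddings, labels: (N,) class ids.
    """
    features = F.normalize(features, dim=1)            # compare on the unit sphere
    dist = torch.cdist(features, features, p=2)        # (N, N) pairwise distances
    same = labels.unsqueeze(0) == labels.unsqueeze(1)  # same-class mask
    eye = torch.eye(len(labels), dtype=torch.bool, device=labels.device)

    intra = dist[same & ~eye]                          # same class, excluding self-pairs
    inter = dist[~same]                                # different-class pairs

    intra_loss = intra.mean() if intra.numel() else features.new_zeros(())
    # Hinge term: only penalize different-class pairs closer than the margin.
    inter_loss = F.relu(margin - inter).mean() if inter.numel() else features.new_zeros(())
    return intra_loss + inter_loss
```

Such a term would typically be added to the classifier loss on each incremental step, so that new-class features are placed without collapsing onto regions occupied by old classes.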
