Abstract

Convolutional neural networks (CNNs) are now successfully applied to object detection in images. When new object classes emerge, a common practice is to adapt a CNN-based detection model by retraining it on samples of the new classes. Unfortunately, the adapted model can then detect only the new classes and can no longer identify the old ones, a phenomenon known as catastrophic forgetting, which also occurs in incremental classification tasks. Knowledge distillation has achieved good results in incremental learning for classification. However, because object detection comprises two tasks at once, classifying objects and localizing them, a straightforward transfer of the knowledge distillation method does not yield satisfactory results for incremental object detection. Hence, this paper proposes a new knowledge distillation approach for incremental object detection. It introduces a new object detection distillation loss that covers not only the classification results but also the locations of the predicted bounding boxes, and applies not only to the final detected regions of interest but also to all intermediate region proposals. Furthermore, to avoid forgetting the knowledge learned from old datasets, the method not only employs hint learning to retain the feature information of the initial model, but also innovatively uses a confidence loss to preserve the confidence information of the initial model. A series of experiments on the PASCAL VOC 2007 dataset verifies the effectiveness of the proposed method.
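The abstract describes the composite objective only at a high level. Below is a minimal sketch, in PyTorch, of how such a detection distillation loss might be assembled for a two-stage (Faster R-CNN-style) detector; the function name, the weighting factors, and the temperature are illustrative assumptions, not the authors' implementation.

    import torch.nn.functional as F

    def detection_distillation_loss(
            s_cls_logits, t_cls_logits,    # class scores: student (adapted) vs. teacher (initial) model
            s_bbox_deltas, t_bbox_deltas,  # bounding-box regression outputs for the same proposals
            s_feats, t_feats,              # intermediate feature maps (hint learning)
            s_conf, t_conf,                # proposal confidence / objectness scores
            T=2.0, w_cls=1.0, w_loc=1.0, w_hint=0.5, w_conf=0.5):
        # Classification distillation: match softened class distributions
        # from the initial model, using temperature T.
        l_cls = F.kl_div(F.log_softmax(s_cls_logits / T, dim=-1),
                         F.softmax(t_cls_logits / T, dim=-1),
                         reduction="batchmean") * (T * T)
        # Location distillation: pull the student's box regression outputs
        # toward the teacher's.
        l_loc = F.smooth_l1_loss(s_bbox_deltas, t_bbox_deltas)
        # Hint learning: keep intermediate features close to the initial model's.
        l_hint = F.mse_loss(s_feats, t_feats)
        # Confidence loss: preserve the initial model's confidence estimates.
        l_conf = F.mse_loss(s_conf, t_conf)
        return w_cls * l_cls + w_loc * l_loc + w_hint * l_hint + w_conf * l_conf

In the paper's setting, a term of this form would be added to the standard detection loss on the new classes and, per the abstract, evaluated both on the intermediate region proposals and on the final detected regions of interest.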

