Abstract

Object detection plays an important role in remote sensing, with significant applications such as intelligent monitoring and urban planning. However, traditional models are constrained to a predefined set of classes and suffer from catastrophic forgetting when learning new classes after deployment. To address this problem, we propose a novel Instance-aware Distillation approach for Class-incremental Object Detection (IDCOD). Our approach uses the teacher model, i.e., the model from the previous stage, to guide the training of the new model on novel data, allowing it to gradually acquire knowledge of new classes while preserving performance on previously learned ones. Instance-aware distillation with masks for old and new classes reduces forgetting while limiting interference with new-class learning. Furthermore, we design a pseudo-label module to expand the training data for old classes. Experiments on the challenging DOTA, DIOR, RTDOD, and PASCAL VOC datasets show that our method detects old classes effectively, incrementally detects new classes, and mitigates catastrophic forgetting.
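The abstract does not give the exact formulation of the two mechanisms it names. As an illustration only, the sketch below shows one plausible shape for them: an L2 distillation loss restricted by an instance mask to old-class regions, and confidence-thresholded pseudo-labels taken from the teacher's detections. All function names, the L2 choice, and the threshold value are assumptions, not the paper's method.

```python
import numpy as np

def instance_masked_distill_loss(student_logits, teacher_logits, old_mask):
    """Hypothetical instance-aware distillation term.

    student_logits, teacher_logits: (H, W, C) classification maps.
    old_mask: (H, W) binary mask covering old-class instances; the squared
    teacher-student difference is penalized only inside the mask, so
    new-class regions are left free to fit the new ground truth.
    """
    diff = (student_logits - teacher_logits) ** 2
    masked = diff * old_mask[..., None]  # broadcast mask over the class dim
    return masked.sum() / max(old_mask.sum(), 1)

def pseudo_labels(teacher_boxes, teacher_scores, thresh=0.5):
    """Keep confident teacher detections as extra old-class labels.

    thresh is an assumed confidence cutoff; the paper may use a
    different selection rule.
    """
    keep = teacher_scores >= thresh
    return teacher_boxes[keep], teacher_scores[keep]
```

In this sketch the masked loss is zero wherever the student already matches the teacher, and the pseudo-label step simply augments the new-stage training set with old-class boxes the teacher still detects confidently.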
