Abstract

Deep neural networks have achieved significant progress in semantic segmentation but still suffer from catastrophic forgetting, i.e., networks forget old classes as they learn new ones. In this article, we propose an effective class-incremental segmentation framework that does not store old data. Specifically, to alleviate catastrophic forgetting, we present two key modules: the Pixel-level Feature Generation (PFG) module and the Task-wise Knowledge Distillation (TKD) module. The PFG module can generate an arbitrary number of features of the old classes at any time, preserving the memory of those classes. To the best of our knowledge, PFG is the first attempt to apply a generative method to class-incremental segmentation of aerial images; unlike previous approaches that generate whole images, it generates pixel-level features, which are better suited to the segmentation task. Meanwhile, the proposed TKD module is designed specifically for class-incremental tasks: it compares only classes within the same learning step (task), so old classes are not squeezed by new classes when the output is normalized with softmax, making distillation more effective. Extensive experiments show that our method is remarkably effective, achieving gains of more than 4.5% over state-of-the-art methods and more than 13% over baselines under all learning conditions. The ablation studies show that the PFG module and the TKD module are both indispensable. Moreover, the proposed framework can be combined with any existing class-incremental learning method to achieve better performance.
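The task-wise distillation idea can be illustrated with a minimal numpy sketch (function and variable names are hypothetical, not from the paper). Instead of one softmax over all classes, the distillation loss applies softmax separately inside each task's class subset, so probability mass for old classes is never redistributed toward newly added classes; the sketch assumes both the old and new model emit logits for all classes seen so far.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def taskwise_kd_loss(old_logits, new_logits, task_groups):
    # KL divergence computed within each task's class subset, so new
    # classes never compete with old classes in the normalization.
    # task_groups: list of class-index lists, one per learning step.
    loss = 0.0
    for cls_idx in task_groups:
        p = softmax(old_logits[..., cls_idx])  # teacher (old model)
        q = softmax(new_logits[..., cls_idx])  # student (new model)
        loss += np.mean(np.sum(p * (np.log(p) - np.log(q)), axis=-1))
    return loss / len(task_groups)

# Example: 4 pixels, 5 classes; classes 0-2 from step 1, classes 3-4 from step 2.
rng = np.random.default_rng(0)
old = rng.normal(size=(4, 5))
new = old + 0.1 * rng.normal(size=(4, 5))
print(taskwise_kd_loss(old, new, [[0, 1, 2], [3, 4]]))
```

With a single global softmax, adding new-class logits would shrink old-class probabilities even when old-class logits are unchanged; the per-task grouping above avoids that distortion by construction.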

