In data-scarce scenarios, few-shot object detection (FSOD) methods offer a notable advantage in alleviating overfitting. Research on FSOD in remote sensing is advancing rapidly, and FSOD methods based on the fine-tuning paradigm have begun to demonstrate excellent performance. However, existing fine-tuning methods often suffer from classification confusion, potentially due to the lack of explicit modeling of transferable common knowledge and to biased class distributions, especially for fine-grained targets with high inter-class similarity and intra-class variance. In view of this, we first propose a decoupled self-distillation (DSD) method that constructs class prototypes in two decoupled feature spaces and measures inter-class correlations to serve as soft labels or aggregation weights. To ensure a robust set of class prototypes during self-distillation, we devise a feature filtering module (FFM) that preselects high-quality class-representative features. Furthermore, we introduce a two-step progressive prototype calibration module (PPCM), which first compensates the base prototypes with the prior base distribution and then calibrates the novel prototypes using adjacent calibrated base prototypes. Experiments on the MAR20 and a customized SHIP20 dataset demonstrate the superior performance of our method over existing advanced FSOD methods and confirm the effectiveness of each proposed component.
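To make the prototype-based ideas concrete, the sketch below illustrates (not the paper's actual implementation) two of the operations the abstract describes: turning inter-class prototype similarities into temperature-scaled soft labels, and calibrating a novel-class prototype by aggregating its nearest base prototypes. All function names, the temperature, the neighbor count `k`, and the blending weight `alpha` are hypothetical choices for illustration.

```python
import numpy as np

def soft_labels(prototypes, temperature=0.1):
    """Inter-class correlations as soft labels (hypothetical sketch).

    prototypes: (C, D) array, one row per class prototype.
    Returns a (C, C) row-stochastic matrix.
    """
    # Cosine similarity between L2-normalized class prototypes.
    p = prototypes / np.linalg.norm(prototypes, axis=1, keepdims=True)
    sim = p @ p.T
    # Temperature-scaled softmax over classes encodes how strongly
    # each class correlates with the others.
    logits = sim / temperature
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    exp = np.exp(logits)
    return exp / exp.sum(axis=1, keepdims=True)

def calibrate_novel(novel_proto, base_protos, k=2, alpha=0.5):
    """Calibrate a novel prototype with adjacent base prototypes.

    Picks the k base prototypes most similar (cosine) to the novel
    one, aggregates them with softmax weights, and blends the result
    into the novel prototype. alpha controls the blend (assumption).
    """
    n = novel_proto / np.linalg.norm(novel_proto)
    b = base_protos / np.linalg.norm(base_protos, axis=1, keepdims=True)
    sims = b @ n
    top = np.argsort(sims)[-k:]  # indices of the k nearest base classes
    w = np.exp(sims[top]) / np.exp(sims[top]).sum()  # aggregation weights
    neighbor_mix = (w[:, None] * base_protos[top]).sum(axis=0)
    return alpha * novel_proto + (1 - alpha) * neighbor_mix
```

A sharper temperature concentrates the soft labels on the most similar classes, while a larger `k` draws calibration information from a broader neighborhood of base classes.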