Knowledge distillation (KD) is one of the most promising approaches to building lightweight detectors, which play a significant role in satellite in-orbit processing and unmanned aerial vehicle tracking. However, existing distillation paradigms exhibit limited accuracy when detecting arbitrarily oriented objects represented by rotated bounding boxes in remote sensing images. This limitation stems from two issues: (i) boundary-discontinuous localization distillation, caused by the angle periodicity of rotated bounding boxes, and (ii) spatially ossified feature distillation, induced by orientation-agnostic knowledge-transfer regions; both lead to ambiguous orientation estimation. To address these issues, we propose an effective KD method called Orientation Distillation (OD) via anti-ambiguous spatial transformation, which consists of two modules. (i) The Anti-ambiguous Location Prediction (ALP) module reformulates the regression transformation between teacher and student bounding boxes as a Gaussian distribution fitting procedure; the distributions to be distilled are optimized to localize objects accurately with the aid of a boundary-continuity cost. (ii) The Orientation-guided Feature Calibration (OFC) module employs a learnable affine matrix to augment the fixed CNN sampling grid into a spatially remapped one, bridging the multi-scale features of teacher and student to effectively deliver refined orientation awareness within adaptively determined distillation regions. Overall, OD customizes the spatial transformations of the bounding box representation and the sampling grid to transfer anti-ambiguous orientation knowledge, and significantly improves the performance of lightweight detectors on non-axially arranged objects. Extensive experiments on multiple datasets demonstrate that our plug-and-play distillation framework achieves state-of-the-art performance. Code is available at https://github.com/Molly6/OD.
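To make the Gaussian-fitting idea behind ALP concrete, the following is a minimal illustrative sketch (not the authors' implementation) assuming a (cx, cy, w, h, θ) rotated-box parameterization: each box is converted to a 2D Gaussian whose covariance varies continuously with the angle, and the teacher-student localization gap is measured by a KL divergence between the two Gaussians, sidestepping the angle-periodicity discontinuity of direct angle regression.

```python
import torch

def rbox_to_gaussian(rbox):
    """Convert rotated boxes (..., 5) = (cx, cy, w, h, angle) to 2D Gaussians.
    Returns mean (..., 2) and covariance (..., 2, 2); the covariance is a
    continuous function of the angle, so no boundary discontinuity arises."""
    cx, cy, w, h, a = rbox.unbind(dim=-1)
    cos, sin = torch.cos(a), torch.sin(a)
    R = torch.stack([cos, -sin, sin, cos], dim=-1).view(*rbox.shape[:-1], 2, 2)
    S = torch.diag_embed(torch.stack([w, h], dim=-1) / 2)  # half-extents
    mu = torch.stack([cx, cy], dim=-1)
    sigma = R @ S @ S @ R.transpose(-1, -2)
    return mu, sigma

def gaussian_kl(mu_s, sig_s, mu_t, sig_t):
    """KL(student || teacher) between the fitted box Gaussians,
    usable as a boundary-continuous localization distillation cost."""
    d = (mu_t - mu_s).unsqueeze(-1)
    sig_t_inv = torch.inverse(sig_t)
    trace_term = (sig_t_inv @ sig_s).diagonal(dim1=-2, dim2=-1).sum(-1)
    maha_term = (d.transpose(-1, -2) @ sig_t_inv @ d).squeeze(-1).squeeze(-1)
    logdet_term = torch.logdet(sig_t) - torch.logdet(sig_s)
    return 0.5 * (trace_term + maha_term + logdet_term - 2)

# Usage sketch with hypothetical teacher/student box predictions:
# mu_s, sig_s = rbox_to_gaussian(student_rboxes)
# mu_t, sig_t = rbox_to_gaussian(teacher_rboxes.detach())
# loc_distill_loss = gaussian_kl(mu_s, sig_s, mu_t, sig_t).mean()
```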