Abstract

Benefiting from advances in object detection for remote sensing, detecting objects in images captured by drones has achieved promising performance in recent years. However, drone-view object detection in rainy weather conditions (Rainy DroneDet) remains a challenge, as small objects blurred by rain streaks offer little valuable information for robust detection. In this paper, we propose a Collaborative Deraining Network, called "CoDerainNet", which simultaneously and interactively trains a Deraining Subnetwork and a DroneDet Subnetwork to improve the accuracy of Rainy DroneDet. Furthermore, we propose a Collaborative Teaching paradigm, called "ColTeaching", which uses rain-free features extracted by the Deraining Subnetwork to teach the DroneDet Subnetwork, removing rain-specific interference from the features used for detection. Due to the lack of an existing dataset for Rainy DroneDet, we constructed three drone datasets: two synthetic datasets, RainVisdrone and RainUAVDT, and one real drone dataset, RainDrone. Extensive experimental results on the three rainy datasets show that CoDerainNet significantly reduces the computational cost of state-of-the-art (SOTA) object detectors while maintaining detection performance comparable to these SOTA models.
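
The abstract does not specify the exact architectures or losses, so the following is only a minimal sketch of the ColTeaching idea under assumed interfaces: a hypothetical `derain_net` that yields (approximately) rain-free features and a hypothetical `det_net` whose backbone features are pulled toward them with a simple feature-mimicking loss during joint training.

```python
import torch.nn as nn
import torch.nn.functional as F

# Hypothetical sketch: network names, signatures, and the MSE-based teaching
# loss are assumptions for illustration, not the paper's exact formulation.

class ColTeachingLoss(nn.Module):
    """Feature-level teaching: pull the detector's features on a rainy image
    toward the rain-free features produced by the deraining subnetwork."""

    def __init__(self, weight: float = 1.0):
        super().__init__()
        self.weight = weight

    def forward(self, det_feats, derain_feats):
        # Assumes both subnetworks expose spatially aligned feature maps;
        # the teacher features are detached so only the detector is guided.
        return self.weight * F.mse_loss(det_feats, derain_feats.detach())


def training_step(rainy_img, targets, derain_net, det_net, det_criterion, colteach):
    # 1) Deraining subnetwork extracts (approximately) rain-free features.
    derain_feats, _derained_img = derain_net(rainy_img)
    # 2) Detection subnetwork runs directly on the rainy input.
    det_feats, det_outputs = det_net(rainy_img)
    # 3) Joint objective: detection loss plus the feature-teaching term,
    #    so the two subnetworks are trained simultaneously and interactively.
    return det_criterion(det_outputs, targets) + colteach(det_feats, derain_feats)
```

Because the teaching signal acts only on intermediate features, the deraining branch can be dropped at inference time, which is consistent with the reported reduction in computational cost relative to SOTA detectors.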
