Abstract

Deep learning-based remote sensing object detectors usually comprise two branches: classification and localization. Recently proposed detectors often follow a pipeline in which the two branches share the same feature maps, which creates a strong coupling between them. However, when tackling remote sensing images, this strong coupling may impair detector performance, because the top-view perspective of remote sensing images can cause conflicts between the classification and localization branches. To address this issue, we propose a decoupled classification localization network (DCL-Net) that accounts for the different characteristics of the two branches. Two modules are developed to suppress the strong coupling: a receptive field aggregation module (RFAM) and a bottom-up path aggregation module (PAM). For the classification branch, RFAM learns the relationship between objects and context information by simulating the human receptive field, improving the robustness of the classification branch to rotational distortions. For the localization branch, PAM enhances the entire feature hierarchy by propagating the rich detailed information of low-level features, which helps the detector achieve precise bounding box regression. Compared with existing methods, the major contribution of DCL-Net is that it significantly enhances the independence of the classification and localization branches, which can benefit detection accuracy for objects in remote sensing images. Experiments on public data sets validate the effectiveness of our detector.
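The core decoupling idea — giving the classification and localization branches their own prediction heads instead of one shared output — can be illustrated with a minimal sketch. This is a hypothetical illustration with invented shapes and weights (15 classes, a 256-channel 8x8 feature map, 1x1 convolution heads), not the authors' DCL-Net implementation, which additionally inserts RFAM and PAM before the respective heads.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1x1(x, w):
    # A 1x1 convolution is a per-location linear map over channels:
    # (C_in, H, W) -> (C_out, H, W).
    return np.tensordot(w, x, axes=([1], [0]))

# Shared backbone feature map (hypothetical size: 256 channels on an 8x8 grid).
feat = rng.standard_normal((256, 8, 8))

# Decoupled heads: each branch owns its weights, so in a real framework the
# classification and regression losses would update separate parameters
# rather than pulling one shared head in conflicting directions.
w_cls = rng.standard_normal((15, 256)) * 0.01   # e.g. 15 object classes
w_loc = rng.standard_normal((4, 256)) * 0.01    # box offsets (dx, dy, dw, dh)

cls_logits = conv1x1(feat, w_cls)   # per-location class scores, (15, 8, 8)
box_deltas = conv1x1(feat, w_loc)   # per-location box regression, (4, 8, 8)

print(cls_logits.shape, box_deltas.shape)
```

In DCL-Net, the two branches would further diverge before these heads: the classification path would pass through RFAM for context aggregation, and the localization path through PAM for low-level detail enhancement.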

