Abstract

Arbitrarily oriented object detection in remote sensing images is a challenging task. Most existing algorithms focus on improving detection accuracy while neglecting detection speed. To further improve detection accuracy and provide a more efficient model for scenes that require real-time detection, we propose an improved YOLOv4-CSP network for rotated object detection in remote sensing images. Our approach makes three main contributions. First, we design a new bounding box regression loss function, the distance and angle intersection over union (DAIoU) loss. This loss is formed by adding a distance penalty term and an angle penalty term to the standard intersection over union (IoU), and it is suitable for arbitrarily oriented object detection networks. Second, we develop an adaptive angle setting method for anchors based on the k-means clustering algorithm. This method obtains representative angles that better capture the distribution of ground-truth angles. Assigning these representative angles to all anchors during training reduces the effort the network needs to adjust anchors to ground-truth (GT) bounding boxes. Finally, we adapt the YOLOv4-CSP network to rotated-anchor detection by applying rotation transformations. We combine the above methods and use the final network to perform the detection task. Experimental results on three remote sensing datasets, i.e., HRSC2016, UCAS-AOD, and SSDD+, validate the effectiveness of our method. Comparisons with state-of-the-art methods demonstrate that our approach significantly improves detection accuracy while achieving a higher detection speed.
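
The abstract does not give the exact formulation of DAIoU, so the following is only a minimal sketch of the general idea (IoU term plus a distance penalty and an angle penalty). The normalization choices, the function name, and the box parameterization (cx, cy, w, h, theta) are assumptions and may differ from the paper's actual definition.

```python
import math

def daiou_loss(iou, box_p, box_g, diag_sq):
    """Hypothetical DAIoU-style regression loss (illustrative only).

    iou     : precomputed IoU of the two rotated boxes
    box_p   : predicted box as (cx, cy, w, h, theta), theta in radians
    box_g   : ground-truth box in the same format
    diag_sq : squared diagonal of the smallest enclosing box, used to
              normalize the center-distance penalty (DIoU-style assumption)
    """
    cxp, cyp, _, _, tp = box_p
    cxg, cyg, _, _, tg = box_g

    # Distance penalty: normalized squared distance between box centers.
    rho_sq = (cxp - cxg) ** 2 + (cyp - cyg) ** 2
    dist_penalty = rho_sq / diag_sq

    # Angle penalty: absolute angle difference, assuming a pi-periodic
    # angle definition, normalized to [0, 1].
    dtheta = abs(tp - tg) % math.pi
    dtheta = min(dtheta, math.pi - dtheta)
    angle_penalty = dtheta / (math.pi / 2)

    return 1.0 - iou + dist_penalty + angle_penalty
```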
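
Likewise, the adaptive angle setting can be illustrated with a simple 1-D k-means over the ground-truth angles; the resulting cluster centers serve as the representative angles assigned to the rotated anchors. The function name, the choice of k, and the use of plain scalar distance (ignoring angular periodicity) are simplifying assumptions of this sketch.

```python
import numpy as np

def representative_angles(gt_angles, k=3, iters=100, seed=0):
    """1-D k-means over ground-truth box angles (degrees), illustrative sketch."""
    rng = np.random.default_rng(seed)
    angles = np.asarray(gt_angles, dtype=np.float64)
    # Initialize cluster centers from randomly chosen ground-truth angles.
    centres = rng.choice(angles, size=k, replace=False)
    for _ in range(iters):
        # Assign each angle to its nearest center.
        labels = np.argmin(np.abs(angles[:, None] - centres[None, :]), axis=1)
        # Recompute centers as cluster means; keep the old center if a cluster is empty.
        new_centres = np.array([
            angles[labels == j].mean() if np.any(labels == j) else centres[j]
            for j in range(k)
        ])
        if np.allclose(new_centres, centres):
            break
        centres = new_centres
    return np.sort(centres)

# Example: derive three representative angles from a dataset's angle set.
print(representative_angles([5, 8, 10, 42, 47, 85, 88, 90], k=3))
```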
