Abstract

Remote sensing image object detection is widely used in military reconnaissance, disaster relief and urban traffic management. However, unlike ordinary images, remote sensing images are acquired from an overhead viewpoint, so targets appear in arbitrary orientations. As a result, general object detection algorithms achieve poor detection accuracy on remote sensing images. To address this problem, an improved YOLOv5 algorithm (Rotate-YOLOv5) is proposed for detecting arbitrarily oriented objects in remote sensing images. First, YOLOv5m was chosen as the baseline network, and four categories of movable targets were selected from the public DOTA dataset: plane, small vehicle, large vehicle and ship. The dataset images were cropped to 1024×1024 and preprocessed with mosaic data augmentation, and the anchor box sizes were determined by an adaptive anchor box filtering method. The long-edge definition with circular smooth labels was then used to represent rotated bounding boxes; by converting angle regression into a classification problem, the effect of angular periodicity on training was eliminated. Finally, CIoU was adopted as the bounding box regression loss to improve detection accuracy while maintaining detection speed. Experimental results show that the proposed algorithm improves mean average precision by 13.4% over YOLOv5, demonstrating that it can effectively improve the accuracy of object detection in remote sensing images.
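To illustrate how circular smooth labels turn angle regression into a classification problem, the sketch below encodes a box angle (long-edge definition, in [0, 180)) as a soft label vector. The bin count (180), the choice of a Gaussian window and the window radius sigma are assumptions for illustration, not values reported in the paper.

```python
import numpy as np

def circular_smooth_label(angle_deg, num_bins=180, sigma=6.0):
    """Encode an angle in the long-edge definition as a circular smooth label.

    The label is a Gaussian window centred on the true angle bin and wrapped
    circularly, so bins near 0 deg and 179 deg are treated as neighbours.
    Classifying over these soft labels avoids the discontinuity that angle
    periodicity causes in direct regression.
    """
    bins = np.arange(num_bins)
    centre = int(round(angle_deg)) % num_bins
    # Circular distance between each bin and the true angle bin.
    dist = np.minimum(np.abs(bins - centre), num_bins - np.abs(bins - centre))
    return np.exp(-dist ** 2 / (2 * sigma ** 2))

# Example: 179 deg and 0 deg are close under the circular metric,
# so their label vectors overlap heavily.
print(circular_smooth_label(179)[:3], circular_smooth_label(0)[:3])
```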
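For the bounding box regression loss, a minimal sketch of the standard CIoU loss on axis-aligned (cx, cy, w, h) boxes is given below; the exact way the paper combines it with the rotated box representation is not specified in the abstract, so this is only the generic formulation.

```python
import math

def ciou_loss(box1, box2, eps=1e-7):
    """Complete-IoU loss between two boxes given as (cx, cy, w, h).

    CIoU = IoU - rho^2 / c^2 - alpha * v, and the loss is 1 - CIoU, where
    rho is the centre distance, c the diagonal of the smallest enclosing box,
    and v measures aspect-ratio consistency.
    """
    cx1, cy1, w1, h1 = box1
    cx2, cy2, w2, h2 = box2
    x1min, y1min, x1max, y1max = cx1 - w1 / 2, cy1 - h1 / 2, cx1 + w1 / 2, cy1 + h1 / 2
    x2min, y2min, x2max, y2max = cx2 - w2 / 2, cy2 - h2 / 2, cx2 + w2 / 2, cy2 + h2 / 2

    # Intersection over union.
    iw = max(0.0, min(x1max, x2max) - max(x1min, x2min))
    ih = max(0.0, min(y1max, y2max) - max(y1min, y2min))
    inter = iw * ih
    union = w1 * h1 + w2 * h2 - inter + eps
    iou = inter / union

    # Squared centre distance over squared diagonal of the enclosing box.
    cw = max(x1max, x2max) - min(x1min, x2min)
    ch = max(y1max, y2max) - min(y1min, y2min)
    rho2 = (cx2 - cx1) ** 2 + (cy2 - cy1) ** 2
    c2 = cw ** 2 + ch ** 2 + eps

    # Aspect-ratio consistency term.
    v = (4 / math.pi ** 2) * (math.atan(w2 / (h2 + eps)) - math.atan(w1 / (h1 + eps))) ** 2
    alpha = v / (1 - iou + v + eps)

    return 1 - (iou - rho2 / c2 - alpha * v)
```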
