Abstract

This study presents a radar-optical fusion detection method for unmanned aerial vehicles (UAVs) in maritime environments. Radar and camera technologies are integrated to improve the detection capabilities of the platforms. The proposed method generates regions of interest (ROIs) by projecting radar traces onto optical images through matrix transformation and geometric centroid registration. The generated ROIs are matched with YOLO detection boxes using the intersection-over-union (IoU) algorithm, enabling radar-optical fusion detection. A modified algorithm, SPN-YOLOv7-tiny, is developed to address the challenge of detecting small UAV targets that are easily missed in images. In this algorithm, the convolutional layers in the backbone network are replaced with space-to-depth convolutions, a small-object detection layer is added, and the loss function is replaced with a normalized weighted distance loss function. Experimental results show that SPN-YOLOv7-tiny improves the mAP@0.5 (mean average precision at an IoU threshold of 0.5) of the original YOLOv7-tiny from 0.852 to 0.93 while maintaining a high frame rate of 135.1 frames per second. Moreover, the proposed radar-optical fusion detection method achieves an accuracy of 96.98%, surpassing the individual detection results of the radar and the camera, and effectively handles closely spaced, overlapping targets on the radar chart.
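As a concrete illustration of the fusion step described above, the sketch below projects a radar trace onto the image plane, builds an ROI around the projected point, and matches ROIs to YOLO detection boxes by IoU. This is a minimal sketch under stated assumptions, not the paper's implementation: the 3x3 projection matrix H (assumed here to result from the matrix transformation and centroid registration step), the ROI size, and the IoU threshold are all illustrative choices.

```python
# Minimal sketch of radar-optical ROI matching, assuming a precomputed 3x3
# projection matrix H (radar plane -> image pixels). H, roi_size, and
# iou_threshold are illustrative assumptions, not values from the paper.
import numpy as np

def project_radar_trace(H, radar_xy):
    """Project a radar trace point (x, y) onto the image plane via H."""
    p = H @ np.array([radar_xy[0], radar_xy[1], 1.0])
    return p[:2] / p[2]  # homogeneous -> pixel coordinates

def roi_from_centroid(cx, cy, roi_size=(96, 96)):
    """Build an ROI box (x1, y1, x2, y2) centered on the projected point."""
    w, h = roi_size
    return (cx - w / 2, cy - h / 2, cx + w / 2, cy + h / 2)

def iou(a, b):
    """Intersection over union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def fuse(radar_rois, yolo_boxes, iou_threshold=0.3):
    """Greedily match each radar ROI to the YOLO box with the highest IoU."""
    matches = []
    for i, roi in enumerate(radar_rois):
        best_j, best_iou = -1, iou_threshold
        for j, box in enumerate(yolo_boxes):
            v = iou(roi, box)
            if v > best_iou:
                best_j, best_iou = j, v
        if best_j >= 0:
            matches.append((i, best_j, best_iou))
    return matches
```

In a pipeline of this kind, matched pairs would confirm a YOLO detection with a radar trace, while unmatched radar ROIs flag candidate targets the image detector may have missed.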
