Abstract

Unmanned aerial vehicles (UAVs) are now used in many fields. Localization is key to a UAV's autonomous flight capability, and localizing a UAV in the absence of GNSS signals is a difficult task. Deep learning has advanced rapidly in recent years, and convolutional neural networks in particular are widely applied to visual imagery. In this paper, we apply the convolutional neural network DenseNet to matching UAV images against satellite remote sensing images for visual localization of UAVs, and we make several improvements to address problems encountered in practice. We propose a quality-aware template matching method that adaptively adjusts convolutional feature weights to strengthen the model's feature extraction, and we introduce a fusion mechanism over multi-scale feature layers. The feature maps of the UAV and satellite images are assigned quality scores, which are used to locate the UAV's position within the satellite image. Qualitative and quantitative experiments demonstrate the effectiveness and superiority of the method.
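The core idea of quality-aware template matching can be sketched with a minimal NumPy example. This is an illustrative sketch only, not the paper's implementation: the function names, the toy feature vectors, and the specific scoring rule (a soft-ranking product over both matching directions, in the style of QATM-like methods) are assumptions. The sketch treats the UAV image as a template of flattened convolutional feature vectors and scores each location of the satellite search map by how mutually confident the best feature correspondences are.

```python
import numpy as np

def softmax(x, axis):
    # Numerically stable softmax along a given axis
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def quality_aware_match(search_feat, template_feat):
    """Sketch of quality-aware feature matching (hypothetical helper).

    search_feat:   (Ns, C) flattened feature vectors of the satellite image
    template_feat: (Nt, C) flattened feature vectors of the UAV image
    Returns one quality score per satellite-map location.
    """
    # L2-normalise so dot products are cosine similarities
    s = search_feat / np.linalg.norm(search_feat, axis=1, keepdims=True)
    t = template_feat / np.linalg.norm(template_feat, axis=1, keepdims=True)
    sim = s @ t.T  # (Ns, Nt) pairwise similarity

    # Soft-ranking in both directions: a pair scores highly only when
    # each feature is also the other's likeliest match
    quality = softmax(sim, axis=0) * softmax(sim, axis=1)
    return quality.max(axis=1)  # best quality per search location

# Toy usage: locate a 2-vector "UAV template" inside a 4-vector search map
search = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [-1.0, 0.0]])
template = np.array([[0.9, 0.1], [0.1, 0.9]])
scores = quality_aware_match(search, template)
best = int(np.argmax(scores))  # index of the best-matching location
```

On real imagery the feature vectors would come from a CNN backbone such as DenseNet, and the per-location scores would be aggregated over the template's spatial extent before taking the argmax.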
