Abstract
Due to parallax and a shortage of reliable key points, UAV images of scenes that are not rich in structure are difficult to stitch. In this paper, this challenge is addressed for the first time. A trained transformer jointly evaluates global matching and local similarity, making it possible to extract abundant key points in low-feature regions. A new point-matching constraint is then designed based on the scores produced by the transformer. Line protection and distortion resistance are also applied during local correction to alleviate global aberrations. Experiments show that our method significantly outperforms four state-of-the-art algorithms, roughly halving the position error when stitching scenes with inconspicuous features such as woodland, bare land, and rivers.
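As one illustration of the score-based match selection described above, the following is a minimal Python sketch, not the paper's implementation: it assumes a transformer matcher that returns per-match confidence scores, and the function name, threshold value, and synthetic match data are illustrative assumptions only.

```python
# Sketch: gating key-point correspondences by the confidence scores of a
# transformer-based matcher, as might be done before estimating a warp for
# low-texture UAV image pairs. All values below are placeholders.
import numpy as np


def filter_matches_by_score(pts_a: np.ndarray,
                            pts_b: np.ndarray,
                            scores: np.ndarray,
                            threshold: float = 0.2):
    """Keep only point pairs whose matching score exceeds the threshold.

    pts_a, pts_b : (N, 2) arrays of corresponding key-point coordinates
    scores       : (N,) array of per-match confidences in [0, 1]
    """
    keep = scores > threshold
    return pts_a[keep], pts_b[keep], scores[keep]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Stand-in for a transformer matcher's output on a low-texture image pair.
    pts_a = rng.uniform(0, 4000, size=(500, 2))          # pixel coords, image A
    pts_b = pts_a + rng.normal(0, 2.0, size=(500, 2))    # noisy correspondences, image B
    scores = rng.uniform(0, 1, size=500)                 # per-match confidence
    a, b, s = filter_matches_by_score(pts_a, pts_b, scores)
    print(f"kept {len(s)} of 500 putative matches")
```

In practice the retained pairs would feed whatever alignment model is used downstream; the point of the sketch is only that the transformer's scores, rather than descriptor distances alone, decide which correspondences survive.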