Abstract

Multi-modal image matching is crucial in aerospace applications because it can fully exploit the complementary and valuable information contained in the large volume and diversity of remote sensing images. However, it remains a challenging task due to significant non-linear radiometric differences, geometric differences, and noise across different sensors. To improve the performance of heterologous image matching, this paper proposes a normalized self-similarity region descriptor that extracts consistent structural information. We first construct a pointwise self-similarity region descriptor based on the Euclidean distance between adjacent image blocks to reflect the structural properties of multi-modal images. Then, a linear normalization approach is applied to form the Modality Independent Region Descriptor (MIRD), which can effectively distinguish structural features such as points, lines, corners, and flat regions across multi-modal images. To further improve matching accuracy, the included-angle cosine similarity metric is adopted to exploit the directional vector information of the multi-dimensional feature descriptors. Experimental results show that the proposed MIRD achieves better matching accuracy and robustness than state-of-the-art methods on various multi-modal image pairs. By operating on non-local neighboring image blocks, MIRD effectively extracts consistent geometric structure features and suppresses the influence of SAR speckle noise, making it applicable to a wide range of multi-modal image matching tasks.
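
The pipeline summarized above can be sketched in a few lines of Python. This is a minimal illustration under assumptions, not the authors' implementation: the block size, search radius, number of neighboring directions, and min-max normalization used here are placeholders chosen for the example.

```python
import numpy as np

def self_similarity_descriptor(img, x, y, block=5, radius=8, n_dirs=8):
    """Pointwise self-similarity: Euclidean distance between the block
    centred at (x, y) and blocks at n_dirs non-local neighboring offsets."""
    half = block // 2
    centre = img[y - half:y + half + 1, x - half:x + half + 1].astype(np.float64)
    angles = np.linspace(0.0, 2.0 * np.pi, n_dirs, endpoint=False)
    desc = np.empty(n_dirs)
    for k, a in enumerate(angles):
        dx = int(round(radius * np.cos(a)))
        dy = int(round(radius * np.sin(a)))
        nb = img[y + dy - half:y + dy + half + 1,
                 x + dx - half:x + dx + half + 1].astype(np.float64)
        desc[k] = np.linalg.norm(centre - nb)  # Euclidean distance between image blocks
    return desc

def normalize_descriptor(desc, eps=1e-12):
    """Linear (min-max) normalization so descriptors from different
    modalities share a common range (assumed form of the normalization)."""
    lo, hi = desc.min(), desc.max()
    return (desc - lo) / (hi - lo + eps)

def cosine_similarity(d1, d2, eps=1e-12):
    """Included-angle cosine between two multi-dimensional descriptors."""
    return float(np.dot(d1, d2) / (np.linalg.norm(d1) * np.linalg.norm(d2) + eps))

# Toy usage: compare descriptors at the same location in two synthetic "modalities".
rng = np.random.default_rng(0)
optical = rng.random((64, 64))
sar = optical + 0.1 * rng.standard_normal((64, 64))  # stand-in for a second modality
d_opt = normalize_descriptor(self_similarity_descriptor(optical, 32, 32))
d_sar = normalize_descriptor(self_similarity_descriptor(sar, 32, 32))
print(f"cosine similarity: {cosine_similarity(d_opt, d_sar):.3f}")
```

In practice the descriptor would be evaluated densely over a template window and the cosine score maximized over candidate offsets; the sketch only shows the per-pixel descriptor and the similarity metric named in the abstract.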
