Abstract

Geometric distortions and intensity differences always exist in multi-source optical satellite imagery, seriously reducing the similarity between images and making it difficult to obtain adequate, accurate, stable, and well-distributed matches for image registration. To address these problems, an effective image matching method for multi-source optical satellite imagery is presented in this study. The proposed method includes three steps: feature extraction, initial matching, and matching propagation. First, a uniform robust scale-invariant feature transform (UR-SIFT) detector is used to extract adequate and well-distributed feature points. Second, initial matching is conducted based on the Euclidean distance to obtain a few correct matches and the initial projective transformation between the image pair. Finally, two matching strategies are used to propagate matches and produce more reliable matching results: by exploiting the geometric relationship between the image pair, geometric correspondence matching finds more matches than the initial UR-SIFT feature points, and probability relaxation matching then propagates new matches around the initial UR-SIFT feature points. Comprehensive experiments on Chinese ZY3 and GaoFen (GF) satellite images revealed that the proposed algorithm performs well in terms of the number of correct matches, correct matching rate, spatial distribution, and matching accuracy compared with the standard UR-SIFT and a triangulation-based propagation method.
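The three steps above map naturally onto a short pipeline. The sketch below is a minimal illustration, not the authors' implementation: OpenCV's standard SIFT stands in for UR-SIFT, and RANSAC-based homography fitting stands in for the paper's estimation of the initial projective transformation; the ratio-test threshold is likewise an assumption.

```python
import cv2
import numpy as np

# Minimal sketch of the three-step flow. Plain SIFT stands in for UR-SIFT
# (an assumption: UR-SIFT additionally enforces an even spatial distribution),
# and RANSAC stands in for the paper's outlier removal.
def match_pair(ref_img, sen_img, ratio=0.8, ransac_thresh=3.0):
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(ref_img, None)  # Step 1: features
    kp2, des2 = sift.detectAndCompute(sen_img, None)

    # Step 2: initial matching on Euclidean (L2) descriptor distance,
    # keeping only matches that pass the ratio test
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    raw = matcher.knnMatch(des1, des2, k=2)
    seeds = [m for m, n in raw if m.distance < ratio * n.distance]

    # Estimate the initial projective transformation from the seed matches
    src = np.float32([kp1[m.queryIdx].pt for m in seeds]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in seeds]).reshape(-1, 1, 2)
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, ransac_thresh)

    # Step 3 (propagation) would use H to predict point locations and search
    # locally; see the geometric-correspondence sketch in the Introduction.
    return H, seeds, inliers
```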

Highlights

  • Image matching is the process of finding corresponding points on multi-view images of the same region; these images can be acquired by different sensors at different times [1]

  • To ensure that the retained feature points are evenly distributed, the number of feature points for each scale is pre-defined according to the scale coefficient, which can be calculated from the scale-invariant feature transform (SIFT) [13] (see the quota sketch after this list)

  • If the distance between P and Qj was smaller than n, their normalized correlation coefficient (NCC) was the largest among all feature points on the reference image whose distances to Qj were smaller than n, and that value was larger than r, then P and Qj were initially identified as a pair of matching points (see the pairing sketch after this list)
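To make the per-scale quota in the second highlight concrete, the sketch below divides a total feature budget across scale levels using a geometrically decaying weight. The decay factor k and the exact weighting are illustrative assumptions standing in for the paper's scale coefficient.

```python
# Hedged sketch of pre-defining per-scale feature quotas. The geometric
# decay (factor k) is an illustrative assumption, not the paper's formula.
def per_scale_quota(total_points, num_scales, k=2.0):
    weights = [k ** -i for i in range(num_scales)]  # coarser scales get fewer points
    total_weight = sum(weights)
    return [round(total_points * w / total_weight) for w in weights]

# Example: 1000 points over 4 scale levels -> [533, 267, 133, 67]
```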
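The pairing rule in the last highlight can be written directly as code. In the sketch below, `ref_points` holds reference-image feature coordinates, and `ncc` is an assumed helper returning the normalized correlation coefficient of the two points' local intensity windows; the radius `n` and threshold `r` keep the highlight's names, with placeholder default values.

```python
import numpy as np

# Sketch of the pairing rule: P matches Qj when P lies within radius n of Qj,
# has the highest NCC with Qj among all reference points inside that radius,
# and that NCC exceeds the threshold r. `ncc(p, q)` is an assumed helper.
def find_initial_match(qj, ref_points, ncc, n=10.0, r=0.8):
    candidates = [p for p in ref_points
                  if np.hypot(p[0] - qj[0], p[1] - qj[1]) < n]
    if not candidates:
        return None
    best = max(candidates, key=lambda p: ncc(p, qj))
    return best if ncc(best, qj) > r else None
```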


Summary

Introduction

Image matching is the process of finding corresponding points on multi-view images of the same region; these images can be acquired by different sensors at different times [1]. Matching corresponding points between multi-source optical satellite images is difficult due to the geometric deformation caused by scale change, rotation, and view-angle transformations, as well as the nonlinear intensity differences caused by different radiometric resolutions and the effects of the atmosphere and radiation noise [12]. Image matching methods based on local invariant features are widely used for multi-source optical satellite images due to their robustness to image scale transformation; the most representative is the scale-invariant feature transform (SIFT) algorithm [13,14]. Experimental results on a variety of optical satellite images showed that the proposed method can largely improve the use of feature points and increase the number of correct matches, which were more evenly distributed; the matches obtained in all the steps were taken as the final matches.
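As a hedged illustration of how the initial projective transformation can drive match propagation (the geometric correspondence matching named in the abstract), the sketch below projects a reference point through the estimated homography H and searches a small window around the predicted location for the best NCC score. The window half-size, search radius, and threshold are illustrative assumptions, not the paper's values.

```python
import numpy as np

def ncc(win_a, win_b):
    # Normalized correlation coefficient of two equally sized windows
    a, b = win_a - win_a.mean(), win_b - win_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom > 0 else 0.0

def propagate_point(ref_img, sen_img, H, pt, half=7, radius=5, thresh=0.8):
    # Template around the reference point (boundary handling omitted)
    x, y = int(round(pt[0])), int(round(pt[1]))
    tpl = ref_img[y - half:y + half + 1, x - half:x + half + 1].astype(float)

    # Predict the point's location in the sensed image via the homography
    v = H @ np.array([pt[0], pt[1], 1.0])
    px, py = int(round(v[0] / v[2])), int(round(v[1] / v[2]))

    # Local NCC search around the predicted location
    best, best_xy = -1.0, None
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            cx, cy = px + dx, py + dy
            if cy - half < 0 or cx - half < 0:
                continue  # skip windows falling off the image
            win = sen_img[cy - half:cy + half + 1,
                          cx - half:cx + half + 1].astype(float)
            if win.shape != tpl.shape:
                continue
            score = ncc(tpl, win)
            if score > best:
                best, best_xy = score, (cx, cy)
    return best_xy if best > thresh else None
```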

UR-SIFT Feature Extraction
Initial Matching
Propagation Matching
Geometric Correspondence Matching
Experiments and Analysis
Description of Experimental Datasets
Parameters in UR-SIFT Feature Extraction
Comparative Results and Analysis
Conclusions

