Abstract
Image registration is a fundamental step that greatly affects later processes such as image mosaicking, multi-spectral image fusion, and digital surface modelling, where the final solution requires blending pixel information from more than one image. It is highly desirable to identify registration regions between input stereo image pairs with high accuracy, particularly in remote sensing applications in which ground control points (GCPs) are not always available, such as selecting a landing zone on another planetary body. In this paper, a framework for localization in image registration is developed. It strengthens local registration accuracy in two respects: lower reprojection error and better feature point distribution. Affine scale-invariant feature transform (ASIFT) is used to acquire feature points and correspondences on the input images. A homography matrix is then estimated as the transformation model by an improved random sample consensus (IM-RANSAC) algorithm. To identify a registration region with a better spatial distribution of feature points, the Euclidean distance between feature points is applied (named the S criterion). Finally, the parameters of the homography matrix are optimized by the Levenberg–Marquardt (LM) algorithm with selected feature points from the chosen registration region. In the experiments, Chang’E-2 satellite remote sensing imagery is used to evaluate the performance of the proposed method. The results demonstrate that the proposed method can automatically locate a specific region with high registration accuracy between the input images, achieving a lower root mean square error (RMSE) and a better distribution of feature points.
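As a rough illustration of this pipeline, the sketch below matches affine-invariant features and estimates a homography with OpenCV. The assumptions here: OpenCV's AffineFeature wrapper (available from version 4.5) stands in for the paper's ASIFT implementation, and the stock RANSAC estimator stands in for IM-RANSAC; this is a minimal sketch, not the authors' code.

```python
# Minimal sketch: ASIFT-style matching + RANSAC homography with OpenCV.
import cv2
import numpy as np

def estimate_homography(ref_path, sensed_path):
    ref = cv2.imread(ref_path, cv2.IMREAD_GRAYSCALE)
    sensed = cv2.imread(sensed_path, cv2.IMREAD_GRAYSCALE)

    # Affine-invariant detector/descriptor: AffineFeature simulates viewpoint
    # changes around a SIFT backend, approximating ASIFT.
    detector = cv2.AffineFeature.create(cv2.SIFT_create())
    kp1, des1 = detector.detectAndCompute(ref, None)
    kp2, des2 = detector.detectAndCompute(sensed, None)

    # Tentative (coarse) correspondences via ratio-test matching.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in (p for p in knn if len(p) == 2)
            if m.distance < 0.75 * n.distance]

    src = np.float32([kp1[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

    # Homography from RANSAC inliers; OpenCV refines the RANSAC estimate
    # on the inlier set internally.
    H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC,
                                        ransacReprojThreshold=3.0)
    return H, inlier_mask
```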
Highlights
Image registration refers to aligning two image point sets that share the same scene in a common coordinate system
With a stereo image pair as input, affine scale-invariant feature transform (ASIFT) is applied to generate a large number of correspondences; IM-random sample consensus (IM-RANSAC) removes incorrect and low-accuracy matching feature points to create a set of inliers; an automated region selection mechanism named the S criterion, combined with an optimization process, then locates the image registration region and yields the transformation model (a sketch of this region-scoring idea follows the list)
We propose a novel framework combining several techniques to identify overlapping regions with high registration accuracy for a stereo image pair
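This page does not reproduce the S criterion formula, so the following is a hypothetical sketch of the region-scoring idea it describes: partition the overlap into a grid of candidate regions (the grid itself is an assumption) and score each one by the spatial spread of its inlier feature points, measured with pairwise Euclidean distances.

```python
# Hypothetical sketch of S-criterion-style region selection: prefer regions
# whose inliers are numerous and spatially well spread.
import numpy as np

def spread_score(points):
    """Mean pairwise Euclidean distance among the points in a region."""
    pts = np.asarray(points, dtype=float)
    if len(pts) < 2:
        return 0.0
    diffs = pts[:, None, :] - pts[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    n = len(pts)
    return dists.sum() / (n * (n - 1))  # average over ordered pairs

def select_region(inlier_pts, img_shape, grid=(4, 4), min_pts=10):
    """Pick the grid cell whose inliers give the best spread score."""
    h, w = img_shape[:2]
    best, best_score = None, -1.0
    for i in range(grid[0]):
        for j in range(grid[1]):
            y0, y1 = i * h // grid[0], (i + 1) * h // grid[0]
            x0, x1 = j * w // grid[1], (j + 1) * w // grid[1]
            in_cell = [(x, y) for x, y in inlier_pts
                       if x0 <= x < x1 and y0 <= y < y1]
            if len(in_cell) < min_pts:
                continue
            s = spread_score(in_cell)
            if s > best_score:
                best, best_score = (x0, y0, x1, y1), s
    return best, best_score
```

A real implementation would combine such a spread score with the per-region reprojection error before committing to a registration region.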
Summary
Image registration refers to aligning two image point sets that share the same scene in a common coordinate system. Global registration is unnecessary here, since a small area with high registration accuracy within the preselected landing area is large enough for a landing on the lunar surface. Inspired by this practical demand, we develop a local registration method in this paper to identify a region between an input image pair with higher local registration accuracy than global registration provides. To maximize the local registration accuracy between the reference and sensed images, the proposed method first takes advantage of the feature extraction capacity of ASIFT to obtain sufficient feature points and coarse (tentative) correspondences; the mismatched and low-accuracy correspondences among them are then eliminated by the improved RANSAC (IM-RANSAC) algorithm, proposed in this paper, to obtain high-accuracy inliers.
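The final step named in the abstract, LM refinement of the homography on points from the selected region, could be sketched as follows. SciPy's Levenberg-Marquardt solver and the h33 = 1 parameterisation are assumptions standing in for whatever solver the authors used.

```python
# Minimal sketch: refine an initial homography with Levenberg-Marquardt by
# minimising the reprojection error over selected inlier correspondences.
import numpy as np
from scipy.optimize import least_squares

def reprojection_residuals(h8, src, dst):
    """Residuals of mapping src -> dst with H = [h8..., 1] (h33 fixed to 1)."""
    H = np.append(h8, 1.0).reshape(3, 3)
    ones = np.ones((len(src), 1))
    proj = np.hstack([src, ones]) @ H.T
    proj = proj[:, :2] / proj[:, 2:3]   # back to inhomogeneous coordinates
    return (proj - dst).ravel()

def refine_homography(H0, src_pts, dst_pts):
    """LM refinement of an initial homography on the selected points."""
    h0 = (H0 / H0[2, 2]).ravel()[:8]    # normalise so h33 == 1
    res = least_squares(reprojection_residuals, h0,
                        args=(src_pts, dst_pts), method="lm")
    return np.append(res.x, 1.0).reshape(3, 3)
```

An RMSE of the kind used for evaluation can then be computed from the same residuals, e.g. np.sqrt(np.mean(r**2)) for the residual vector r.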