Abstract

This paper presents a new two-step method for finding two-dimensional translational shifts with subpixel accuracy. The algorithm can measure subpixel shifts even in images with few features or in noisy images, where many existing algorithms fail. In the first step (the integer part), the noise robustness of gradient correlation methods is improved by replacing central-difference differentiators with Savitzky–Golay differentiators (SGDs). In the second step (the subpixel part), several modifications are proposed to increase the accuracy and noise robustness of phase-based methods for finding subpixel shifts. Moreover, two error metrics are introduced to quantify the output accuracy of the integer and subpixel parts of the algorithm. Comprehensive tests were conducted on 2400 standard 128 pixel × 128 pixel subimages subjected to synthetic shifts and rotations. The tests showed that the accuracy of the proposed method in finding translational shifts is of the order of a few ten-thousandths of a pixel, a substantial improvement over other state-of-the-art methods. In the rotation tests, the method outperformed comparable techniques. The results also showed that the proposed method generally performs better than competing methods when images contain Gaussian or salt-and-pepper noise. The proposed method is suitable for applications where high accuracy, robustness to noise, and/or computational efficiency are important.
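To give a concrete sense of the family of techniques the abstract refers to, the sketch below shows classical phase correlation, the standard Fourier-domain approach on which phase-based shift estimation builds. It recovers an integer translational shift between two images; it is illustrative only and is not the paper's proposed gradient-correlation/SGD method, and the function name `phase_correlation_shift` is my own.

```python
import numpy as np

def phase_correlation_shift(ref, shifted):
    """Estimate the integer (row, col) shift between two images via
    classical phase correlation: normalize the cross-power spectrum,
    invert it, and locate the correlation peak."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(shifted)
    cross_power = F2 * np.conj(F1)
    cross_power /= np.abs(cross_power) + 1e-12  # keep phase, drop magnitude
    corr = np.fft.ifft2(cross_power).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Indices past N/2 correspond to negative shifts (FFT wrap-around).
    return tuple(p if p <= n // 2 else p - n for p, n in zip(peak, corr.shape))

# Usage: a 128 x 128 random test image shifted by (5, -3) pixels.
rng = np.random.default_rng(0)
img = rng.random((128, 128))
moved = np.roll(img, (5, -3), axis=(0, 1))
print(phase_correlation_shift(img, moved))  # -> (5, -3)
```

Subpixel variants of this idea refine the estimate by fitting the phase plane of the cross-power spectrum or interpolating around the correlation peak, which is the class of methods the second step of the paper improves upon.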
