Abstract

Robust parameter estimation aims to develop a model that properly fits the data. Estimating the parameters of a geometric model in the presence of noise and error is an important step in many image-processing and computer-vision applications. Since the random sample consensus (RANSAC) algorithm is one of the best-known algorithms in this field, there have been several attempts to improve its performance. In this paper, after a short review of existing methods, a robust and efficient method is proposed that detects gross outliers to increase the inlier-to-outlier ratio in a reduced set of corresponding image points. It uses a new hypothesis-and-verification scheme that exploits the spatial relations between corresponding points extracted from the two images. It can also serve as a preprocessing step for RANSAC, improving both the accuracy and the runtime of RANSAC when estimating the parameters of a geometric model (such as the fundamental or homography matrix). Like almost all previous work on reducing RANSAC's runtime, the proposed method avoids heavy and complicated processing. Performance is analyzed on a variety of standard, challenging datasets for homography and fundamental-matrix estimation (a common benchmark in the literature, especially among state-of-the-art methods), and is compared quantitatively to the RANSAC, PROSAC, and SCRAMSAC robust estimators to demonstrate its superiority. Experimental results show that the proposed method removes about 50% of the outliers in most cases and hence greatly reduces the required runtime of RANSAC, while improving its accuracy.
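To make the hypothesize-and-verify loop that the abstract refers to concrete, the following is a minimal, self-contained RANSAC sketch for a toy problem (fitting a 2D line), not the paper's method or its geometric models; all function names, parameters, and the sample data are illustrative assumptions:

```python
import random

def ransac_line(points, n_iters=200, inlier_thresh=0.5, seed=0):
    """Minimal RANSAC sketch: fit y = a*x + b to 2D points despite outliers.

    Each iteration samples a 2-point hypothesis, verifies it by counting
    points within `inlier_thresh` vertical distance (the consensus set),
    and keeps the model with the largest consensus. Names and parameters
    here are illustrative, not from the paper.
    """
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(n_iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # degenerate sample: skip this hypothesis
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        inliers = [(x, y) for x, y in points
                   if abs(y - (a * x + b)) < inlier_thresh]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers

# Ten points on y = 2x + 1 plus three gross outliers.
data = [(x, 2 * x + 1) for x in range(10)] + [(3, 40), (7, -15), (1, 25)]
model, inliers = ransac_line(data)
```

Because each hypothesis needs only a minimal sample, a single uncontaminated draw recovers the true line even with gross outliers present; removing such outliers beforehand, as the proposed preprocessing does, raises the probability of an uncontaminated sample and so reduces the iterations RANSAC needs.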
