Abstract

In recent years, 3D reconstruction based on structure from motion (SFM) has attracted much attention from the computer vision and graphics communities. It is well known that the speed and quality of SFM systems largely depend on feature tracking. When a large volume of image data is fed into an SFM system, its speed drops sharply, and the problem becomes more severe for large-scale scenes, which typically require several thousand images to recover a point-cloud model. However, none of the existing methods fully addresses the problem of fast feature tracking. Brute-force matching can produce correspondences for small-scale scenes but often gets stuck on repeated features, while hashing-based matching handles only medium-scale scenes and does not scale to large ones. In this paper, we propose a new feature tracking method that works in a parallel manner rather than in a single-threaded scheme. Our method consists of keypoint detection, descriptor computation, descriptor matching by parallel $k$-nearest neighbor (Parallel-KNN) search, and outlier rejection. It matches a large number of keypoints rapidly without incurring high computation time, yielding a set of correct correspondences. We demonstrate and evaluate the proposed method on several challenging benchmark datasets, including ones with highly repeated features, and compare it to state-of-the-art methods. The experimental results indicate that our method outperforms the compared methods in both efficiency and effectiveness.
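The pipeline named in the abstract (keypoint detection, descriptor computation, $k$-nearest-neighbor descriptor matching, and outlier rejection) can be illustrated with a minimal sketch. This is an assumption-laden, single-threaded stand-in built on OpenCV, not the paper's Parallel-KNN implementation: the SIFT detector, the FLANN-based matcher, Lowe's ratio test, and the RANSAC fundamental-matrix filter are illustrative choices only.

```python
# Minimal sketch of the generic feature-tracking pipeline described in the
# abstract: keypoint detection, descriptor computation, KNN descriptor
# matching, and outlier rejection. NOT the paper's Parallel-KNN method;
# a sequential OpenCV-based stand-in for illustration.
import cv2
import numpy as np

def match_pair(img1, img2, ratio=0.8):
    # 1) Keypoint detection and descriptor computation (SIFT as a stand-in).
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # 2) Descriptor matching via approximate k-nearest-neighbor search (k = 2).
    flann = cv2.FlannBasedMatcher(dict(algorithm=1, trees=5), dict(checks=50))
    knn = flann.knnMatch(des1, des2, k=2)

    # 3) Lowe's ratio test rejects ambiguous matches (e.g. repeated features).
    good = [m for pair in knn if len(pair) == 2
            for m, n in [pair] if m.distance < ratio * n.distance]
    if len(good) < 8:          # too few matches for geometric verification
        return good

    # 4) Geometric outlier rejection with a RANSAC-estimated fundamental matrix.
    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])
    F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_RANSAC, 1.0, 0.999)
    if mask is None:
        return good
    return [m for m, keep in zip(good, mask.ravel()) if keep]
```

In the paper's setting, the costly step is the KNN search over all image pairs; the proposed method runs that search in parallel rather than serially as in this sketch.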
