Abstract

In this paper, we propose a new recursive framework for camera resectioning and apply it to offline video-based augmented reality. Our method is based on an unscented particle filter and an independent Metropolis–Hastings chain, which handle nonlinear dynamic systems without local linearization and lead to more accurate results than other nonlinear filters. The proposed method has several desirable properties for camera resectioning: since it does not rely on error-prone linear solutions, the initialization problems of previous resectioning methods do not occur, and jittering error is reduced by enforcing consistency and coherency between adjacent frames in the recursive framework. Our method achieves accuracy comparable to nonlinear optimization methods, which in general incur higher computational cost and complexity. As a result, the proposed algorithm outperforms the standard camera resectioning algorithm. We verify the effectiveness of our method through several experiments on synthetic and real image sequences, comparing its estimation performance with other linear and nonlinear methods.
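To make the recursive filtering idea concrete, the sketch below (Python with NumPy) illustrates a generic resample-move pose filter of the kind the abstract describes: particles over the 6-DoF camera pose are propagated, reweighted by a reprojection-error likelihood, resampled, and then refreshed by an independent Metropolis–Hastings move. This is not the authors' implementation; the unscented (sigma-point) proposal of the paper is replaced here by a simple random-walk motion model, the MH move targets the reprojection likelihood as a surrogate posterior, and all names and parameters (K, X, sigma_px, step, n_mh) are hypothetical.

```python
# Minimal sketch of a resample-move particle filter for camera pose
# (assumptions: known intrinsics K, known 3D points X, observed 2D points z
# per frame; pose = (rx, ry, rz, tx, ty, tz) with Euler angles in radians).
import numpy as np

def rotation(rx, ry, rz):
    """Rotation matrix from XYZ Euler angles."""
    cx, sx = np.cos(rx), np.sin(rx)
    cy, sy = np.cos(ry), np.sin(ry)
    cz, sz = np.cos(rz), np.sin(rz)
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def project(pose, X, K):
    """Project 3D points X (Nx3) into the image with the given pose."""
    Xc = X @ rotation(*pose[:3]).T + pose[3:]
    uv = Xc[:, :2] / Xc[:, 2:3]
    return uv @ K[:2, :2].T + K[:2, 2]

def log_likelihood(pose, z, X, K, sigma_px=2.0):
    """Gaussian likelihood of the observed 2D points under the pose."""
    r = project(pose, X, K) - z
    return -0.5 * np.sum(r ** 2) / sigma_px ** 2

def log_q(poses, mean, step):
    """Log-density of the independent Gaussian MH proposal."""
    return -0.5 * np.sum((poses - mean) ** 2, axis=-1) / step ** 2

def filter_step(particles, weights, z, X, K, step=0.01, n_mh=3, rng=None):
    """One recursive update: propagate, reweight, resample, MH move."""
    rng = np.random.default_rng() if rng is None else rng
    n = len(particles)
    # Propagate with a constant-pose-plus-noise motion model.
    particles = particles + rng.normal(0.0, step, particles.shape)
    # Reweight by the reprojection likelihood of the new frame.
    logw = np.array([log_likelihood(p, z, X, K) for p in particles])
    w = np.exp(logw - logw.max()) * weights
    w /= w.sum()
    # Resample to counter weight degeneracy.
    particles = particles[rng.choice(n, size=n, p=w)]
    # Independent MH moves (proposal centered on the particle mean) to
    # restore diversity after resampling.
    mean = particles.mean(axis=0)
    for _ in range(n_mh):
        prop = mean + rng.normal(0.0, step, particles.shape)
        log_a = (np.array([log_likelihood(p, z, X, K) for p in prop])
                 + log_q(particles, mean, step)
                 - np.array([log_likelihood(p, z, X, K) for p in particles])
                 - log_q(prop, mean, step))
        accept = np.log(rng.uniform(size=n)) < log_a
        particles[accept] = prop[accept]
    return particles, np.full(n, 1.0 / n)
```

Because each frame's estimate is anchored by the resampled particle cloud from the previous frame, abrupt pose changes are penalized, which is the mechanism by which a recursive filter of this kind suppresses frame-to-frame jitter.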
