Abstract

In this paper, we propose a new recursive framework for camera resectioning and apply it to off-line video-based augmented reality. Our method is based on an unscented particle filter and an independent Metropolis–Hastings chain, which handle nonlinear dynamic systems without local linearization and lead to more accurate results than other nonlinear filters. The proposed method has several desirable properties for camera resectioning: because it does not rely on error-prone linear solutions, it avoids the initialization problems of previous resectioning methods, and jittering error is reduced by enforcing consistency and coherence between adjacent frames within the recursive framework. Its accuracy is comparable to that of nonlinear optimization methods, which generally require more computation and are more complex. As a result, the proposed algorithm outperforms the standard camera resectioning algorithm. We verify the effectiveness of our method through several experiments on synthetic and real image sequences, comparing its estimation performance with other linear and nonlinear methods.
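To give a feel for the recursive idea described above, the following is a minimal sketch of a pose-tracking particle filter driven by reprojection error. It is not the paper's method: the authors use an unscented particle filter with independent Metropolis–Hastings moves, whereas this simplified bootstrap-style filter only illustrates how a recursive, linearization-free estimator can propagate camera pose from frame to frame and damp jitter through a motion prior. All function names, noise levels, and the projection model are illustrative assumptions.

```python
# Hypothetical illustration only: a bootstrap particle filter over camera pose.
# The abstract's actual method (unscented particle filter + independent
# Metropolis-Hastings chain) is more sophisticated; this sketch just shows
# the recursive, linearization-free structure.
import numpy as np

def rodrigues(rvec):
    """Axis-angle vector -> 3x3 rotation matrix."""
    theta = np.linalg.norm(rvec)
    if theta < 1e-12:
        return np.eye(3)
    k = rvec / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def project(points3d, rvec, tvec, K):
    """Project 3-D points under pose (rvec, tvec) and intrinsics K."""
    Xc = points3d @ rodrigues(rvec).T + tvec      # camera coordinates
    uvw = Xc @ K.T
    z = np.maximum(uvw[:, 2:3], 1e-9)             # guard against degenerate depth
    return uvw[:, :2] / z                         # pixel coordinates

def resection_pf(points3d, observations, K, n_particles=500,
                 motion_sigma=0.01, pixel_sigma=2.0, seed=0):
    """Recursively estimate a 6-D camera pose (axis-angle + translation)
    for each frame from tracked 2-D observations of known 3-D points."""
    rng = np.random.default_rng(seed)
    particles = rng.normal(0.0, 0.1, size=(n_particles, 6))
    particles[:, 5] += 4.0   # rough prior: camera a few units in front of scene
    estimates = []
    for obs in observations:                      # obs: (n_points, 2) array
        # Propagate with a random-walk motion model; the coherence between
        # adjacent frames is what suppresses jitter.
        particles += rng.normal(0.0, motion_sigma, size=particles.shape)
        # Weight each particle by the reprojection error of the observations.
        log_w = np.empty(n_particles)
        for i, p in enumerate(particles):
            pred = project(points3d, p[:3], p[3:], K)
            log_w[i] = -0.5 * np.sum((pred - obs) ** 2) / pixel_sigma ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        estimates.append(particles.T @ w)         # posterior-mean pose
        idx = rng.choice(n_particles, size=n_particles, p=w)
        particles = particles[idx]                # resample for the next frame
    return estimates
```

In the paper's framework, the plain random-walk proposal above would be replaced by an unscented-Kalman proposal per particle and refined with independent Metropolis–Hastings moves, which is what avoids dependence on an erroneous linear initialization.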
