Abstract

In this paper, we propose a method for estimating the camera pose in environments where the intrinsic camera parameters change dynamically. In video see-through augmented reality (AR), image-based camera pose estimation is used to superimpose virtual objects onto the real environment. In general, video see-through AR cannot accommodate the change in image magnification caused by a change in the camera's field of view, owing to the difficulty of handling changes in the intrinsic camera parameters. To remove this limitation, we propose a novel method that simultaneously estimates the intrinsic and extrinsic camera parameters within an energy minimization framework. Our method consists of an offline stage and an online stage. In the offline stage, the dependence of the intrinsic camera parameters on the zoom value is calibrated. In the online stage, the intrinsic and extrinsic camera parameters are estimated by minimizing the energy function. Our energy function adds two terms to the conventional marker-based formulation: reprojection errors based on the epipolar constraint, and a term enforcing the continuity of zoom values. Using this novel energy function, our method can accurately estimate both intrinsic and extrinsic camera parameters. We confirmed experimentally that the proposed method achieves accurate camera parameter estimation during camera zooming.
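To make the structure of such an objective concrete, the following is a minimal sketch of what a combined energy of this kind could look like; the symbols, weights, and exact error measures here are illustrative assumptions, not the paper's actual formulation:

\[
E(\mathbf{K}_t,\mathbf{R}_t,\mathbf{t}_t)
  = \sum_{i}\bigl\|\mathbf{p}_i-\pi\bigl(\mathbf{K}_t[\mathbf{R}_t\mid\mathbf{t}_t]\,\mathbf{X}_i\bigr)\bigr\|^2
  + \lambda_{\mathrm{epi}}\sum_{j} d\bigl(\mathbf{q}'_j,\;\mathbf{F}_t\,\mathbf{q}_j\bigr)^2
  + \lambda_{\mathrm{zoom}}\,(z_t-z_{t-1})^2
\]

In this sketch, the first term is the conventional marker-based reprojection error, comparing observed marker corners \(\mathbf{p}_i\) against the perspective projection \(\pi\) of their known 3D positions \(\mathbf{X}_i\); the second penalizes the distance \(d\) of tracked feature correspondences \(\mathbf{q}_j \leftrightarrow \mathbf{q}'_j\) from the epipolar lines induced by the fundamental matrix \(\mathbf{F}_t\) between consecutive frames; and the third enforces temporal continuity of the zoom value \(z_t\), which the offline calibration maps to the intrinsic matrix \(\mathbf{K}_t\). The weights \(\lambda_{\mathrm{epi}}\) and \(\lambda_{\mathrm{zoom}}\) balancing the terms are likewise assumed for illustration.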
