Abstract

Wearable computers afford a degree of mobility that makes tracking for augmented reality difficult. This paper presents a novel object-centric tracking architecture for displaying augmented reality media in spatial relationship to objects, regardless of the objects' positions or motions in the world. The advance this system provides is the ability to sense and integrate new features into its tracking database, thereby extending the tracking region automatically. A "lazy evaluation" of the structure-from-motion problem uses images obtained from a single calibrated moving camera and applies recursive filtering to identify and estimate the 3D positions of new features. We evaluate the performance of two filters: a classic Extended Kalman Filter (EKF) and a filter based on a Recursive Average of Covariances (RAC). We conclude with a discussion of implementation issues and results.
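To illustrate the recursive-filtering idea behind both the EKF and RAC approaches, the sketch below shows the simplest case: a linear Kalman measurement update that refines an estimate of a single static quantity (e.g. a feature's depth) from a stream of noisy measurements. This is a generic illustration, not the paper's implementation; the true depth, noise values, and iteration count are hypothetical, and the full EKF generalizes this update to a 3D feature state with a linearized (Jacobian-based) measurement model.

```python
import random

def kalman_update(x, P, z, R):
    """One measurement update for a static scalar state.

    x, P : current estimate and its variance
    z, R : new measurement and its noise variance
    """
    K = P / (P + R)          # Kalman gain: weights prior vs. measurement
    x_new = x + K * (z - x)  # correct the estimate toward the measurement
    P_new = (1.0 - K) * P    # uncertainty shrinks with every update
    return x_new, P_new

random.seed(0)
true_depth = 5.0             # hypothetical true feature depth (metres)
x, P = 0.0, 100.0            # vague prior: near-zero knowledge of depth
R = 0.25                     # assumed measurement noise variance

for _ in range(50):
    z = true_depth + random.gauss(0.0, R ** 0.5)  # simulated noisy observation
    x, P = kalman_update(x, P, z, R)

print(x, P)  # estimate converges toward true_depth; P shrinks
```

Each new camera frame contributes one such update, which is what lets the system estimate a new feature's position incrementally rather than solving a batch structure-from-motion problem.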
