Abstract

We present methods for turning pair-wise registration algorithms into drift-free trackers. Such registration algorithms are abundant, but the simplest techniques for building trackers on top of them exhibit either limited tracking range or drift. Our algorithms maintain the poses associated with a number of key frames, building a view-based appearance model that is used for tracking and refined during tracking. The first method we propose is batch oriented and is ideal for offline tracking. The second is suited for recovering egomotion in large environments where the trajectory of the camera rarely intersects itself, and in other situations where many views are necessary to capture the appearance of the scene. The third method is suitable for situations where a few views are sufficient to capture the appearance of the scene, such as object-tracking. We demonstrate the techniques on egomotion and head-tracking examples and show that they can track for an indefinite amount of time without accumulating drift.
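The core contrast the abstract draws — chaining frame-to-frame registrations (which drifts) versus registering each frame against stored key frames — can be illustrated with a toy sketch. The code below is not the paper's algorithm; it is a minimal 1-D simulation under an assumed registration routine with a small constant per-registration bias, showing that anchoring to key frames slows error accumulation compared with naive chaining (the paper's methods go further and refine the key-frame poses themselves to eliminate drift):

```python
def register(pose_a, pose_b):
    """Hypothetical pairwise registration: returns the relative pose
    between two views, corrupted by a small constant bias standing in
    for per-registration error of a real algorithm."""
    BIAS = 0.01  # assumed error model, for illustration only
    return (pose_b - pose_a) + BIAS

def track_frame_to_frame(true_poses):
    """Chain registrations between consecutive frames: bias compounds
    once per frame, so the estimate drifts linearly with time."""
    est = [0.0]
    for a, b in zip(true_poses, true_poses[1:]):
        est.append(est[-1] + register(a, b))
    return est

def track_with_keyframes(true_poses, key_every=10):
    """Register every frame against the most recent key frame, so bias
    compounds only once per key frame rather than once per frame
    (a crude sketch of a view-based appearance model)."""
    est = [0.0]
    key_true, key_est = true_poses[0], 0.0
    for i, p in enumerate(true_poses[1:], start=1):
        est.append(key_est + register(key_true, p))
        if i % key_every == 0:  # promote this frame to a key frame
            key_true, key_est = p, est[-1]
    return est
```

On a 100-frame trajectory, frame-to-frame chaining accumulates the bias 99 times, while the key-frame tracker accumulates it only once per key-frame promotion, so its final error is roughly an order of magnitude smaller in this toy setting.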


