Abstract

Applications such as robotics and augmented reality (AR) require 3D tracking of rigid objects. In robotic applications, accurate and robust pose estimates increase reliability, whereas in AR scenarios reliable pose estimates reduce jitter. Purely vision-based 3D trackers require either manual pose initialization or an offline training stage, while trackers relying purely on depth sensors are not suitable for AR applications. Therefore, this paper proposes an automated and flexible 3D tracking algorithm based on sensor fusion via an Extended Kalman Filter (EKF) and weighting of measurements. Both 2D and 3D tracking performance are increased significantly by a novel measurement-tracking scheme that estimates optical flow using both the intensity and shape index map (SIM) data of a 3D point cloud. Requiring neither manual pose initialization nor offline training, the proposed method enables highly accurate 3D tracking. Its performance is tested on real and synthetic data and yields superior results compared to conventional techniques.
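The abstract's core idea of EKF-based fusion with measurement weighting can be sketched minimally. The example below is an illustrative constant-velocity filter fusing two noisy position sensors (e.g., a vision-based and a depth-based measurement), where each sensor's noise covariance is inflated by a reliability weight; the class name, state model, and all parameters are assumptions for illustration, not the paper's actual formulation.

```python
import numpy as np

class WeightedEKF:
    """Hypothetical sketch: 1D constant-velocity Kalman filter with
    per-measurement reliability weighting (not the paper's exact model)."""

    def __init__(self, dt=0.1):
        self.x = np.zeros(2)                   # state: [position, velocity]
        self.P = np.eye(2)                     # state covariance
        self.F = np.array([[1.0, dt],          # constant-velocity transition
                           [0.0, 1.0]])
        self.Q = 0.01 * np.eye(2)              # process noise
        self.H = np.array([[1.0, 0.0]])        # we observe position only

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, z, r, weight):
        # Down-weighting a sensor is done here by inflating its noise.
        R = np.array([[r / max(weight, 1e-6)]])
        y = z - self.H @ self.x                    # innovation
        S = self.H @ self.P @ self.H.T + R         # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)   # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P

# Simulate an object moving at 1.0 units/s, observed by two sensors:
# a precise one (full weight) and a noisy one (half weight).
ekf = WeightedEKF(dt=0.1)
rng = np.random.default_rng(0)
true_pos = 0.0
for _ in range(50):
    true_pos += 0.1 * 1.0
    ekf.predict()
    ekf.update(np.array([true_pos + rng.normal(0, 0.05)]), 0.05**2, weight=1.0)
    ekf.update(np.array([true_pos + rng.normal(0, 0.20)]), 0.20**2, weight=0.5)
```

After 50 steps the filtered position tracks the true trajectory closely, and the velocity estimate converges toward the true 1.0 units/s even though only position is measured.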
