Abstract

This paper presents a new Extended Kalman Filter (EKF) for Simultaneous Localization and Mapping (SLAM) of a mobile robot using a 3D Time-of-Flight (TOF) camera. The proposed method employs an egomotion estimation algorithm, which tracks a set of visual features across the preceding intensity images, and uses the resulting estimate as the motion model of the EKF to compute the camera's pose in the world coordinate system. Unlike existing work that attempts to maintain the EKF's consistency by building local maps and then joining them into a global one, the proposed method tests the EKF's consistency at each step and restarts the filter whenever it fails the test. By doing so, the proposed EKF retains its consistency and therefore improves the accuracy of SLAM. Filter consistency is evaluated online by a chi-square test on the time-averaged normalized innovation squared of each stable feature's observation. Experimental results in various indoor and outdoor terrains demonstrate that the proposed approach substantially improves the accuracy of SLAM and thus the accuracy of the resulting 3D maps of the environments.
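The chi-square test on the time-averaged normalized innovation squared (NIS) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function names, the 2-D observation dimension, and the tabulated chi-square bound are assumptions for the example; the standard statistic is that the sum of NIS values over N steps of m-dimensional innovations follows a chi-square distribution with N·m degrees of freedom when the filter is consistent.

```python
import numpy as np

def time_avg_nis(innovations, S_mats):
    """Time-averaged normalized innovation squared over a window.

    innovations: sequence of innovation vectors nu_k, each shape (m,)
    S_mats: sequence of innovation covariance matrices S_k, each (m, m)
    Each term is nu_k^T S_k^{-1} nu_k.
    """
    nis = [float(nu @ np.linalg.solve(S, nu))
           for nu, S in zip(innovations, S_mats)]
    return sum(nis) / len(nis)

def is_consistent(avg_nis, n_steps, chi2_upper):
    """Accept the filter as consistent if the summed NIS
    (n_steps * avg_nis) stays below the chi-square upper bound
    with n_steps * m degrees of freedom."""
    return n_steps * avg_nis <= chi2_upper

# Deterministic example: two 2-D innovations with identity covariance.
nus = [np.array([1.0, 0.0]), np.array([0.0, 2.0])]
Ss = [np.eye(2), np.eye(2)]
avg = time_avg_nis(nus, Ss)          # (1 + 4) / 2 = 2.5
# 95% chi-square upper quantile for 4 degrees of freedom is about 9.49.
print(is_consistent(avg, 2, 9.49))   # prints True: 5.0 <= 9.49
```

In the paper's scheme, a failed test would trigger a filter restart rather than merely flagging the inconsistency.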
