Abstract

Recently, Simultaneous Localization and Mapping (SLAM) has achieved good performance in many environments. However, in complex indoor environments, visual SLAM may lose track of the mobile robot's position in scenes with hardly any features. This paper presents a mobile robot localization method based on a semantic SLAM system, using asynchronous fusion of wheel-odometer and visual-odometer data in an Extended Kalman Filter (EKF). First, semantic SLAM based on ORB-SLAM2 with YOLOv3 removes dynamic objects during localization and mapping and builds a global point-cloud map in which keyframe information is stored. Then, wheel-odometer and visual-odometer information is acquired with the aid of the global point-cloud map. Finally, asynchronous EKF fusion of the visual and wheel odometers for mobile robot localization runs at a frequency of about 50 Hz. Experimental results on a real robot validate the accuracy and robustness of the method by comparison with ground truth from a motion-capture measurement system in an indoor environment.
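To make the fusion step concrete, the following is a minimal sketch (not the authors' code) of asynchronous EKF fusion of a wheel odometer and a visual odometer for a planar pose state [x, y, theta]. The prediction step runs at the wheel-odometry rate (~50 Hz in the paper) and the correction step is applied whenever a visual-odometry pose arrives. All class names, noise covariances, and rates here are illustrative assumptions.

```python
import numpy as np

def wrap(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi

class EkfFusion:
    def __init__(self):
        self.x = np.zeros(3)                   # state: [x, y, theta]
        self.P = np.eye(3) * 1e-3              # state covariance
        self.Q = np.diag([1e-4, 1e-4, 1e-5])   # wheel-odometry process noise (assumed)
        self.R = np.diag([1e-3, 1e-3, 1e-4])   # visual-odometry measurement noise (assumed)

    def predict(self, v, w, dt):
        """Prediction step driven by wheel odometry (v: linear, w: angular velocity)."""
        x, y, th = self.x
        self.x = np.array([x + v * dt * np.cos(th),
                           y + v * dt * np.sin(th),
                           wrap(th + w * dt)])
        # Jacobian of the motion model with respect to the state
        F = np.array([[1.0, 0.0, -v * dt * np.sin(th)],
                      [0.0, 1.0,  v * dt * np.cos(th)],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z):
        """Correction step with a visual-odometry pose measurement z = [x, y, theta]."""
        H = np.eye(3)                          # VO is assumed to observe the full pose
        innov = z - H @ self.x
        innov[2] = wrap(innov[2])
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)
        self.x = self.x + K @ innov
        self.x[2] = wrap(self.x[2])
        self.P = (np.eye(3) - K @ H) @ self.P

# Asynchronous use: predict on every 50 Hz wheel-odometry tick, correct
# whenever a (slower) visual-odometry pose becomes available.
ekf = EkfFusion()
ekf.predict(v=0.2, w=0.05, dt=0.02)            # wheel tick at 50 Hz
ekf.update(np.array([0.004, 0.0, 0.001]))      # VO pose at a lower rate
```

Because the two sources arrive at different rates, decoupling prediction from correction in this way lets the filter output a pose estimate at the higher wheel-odometry frequency while still benefiting from visual corrections.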
