Abstract

Simultaneous Localization and Mapping (SLAM) is a fundamental problem for autonomous mobile robots (AMRs). AMRs are widely used in automated warehousing, factory material-transfer systems, flexible assembly systems, and other intelligent transportation sites. Visual-Inertial Odometry (VIO), which fuses a camera with an inertial measurement unit (IMU), is a popular approach to accurate 6-DoF state estimation. However, while locally accurate, VIO is prone to drift and cannot provide a globally consistent map. Such a map of the environment is a prerequisite for re-localizing the robot and ensuring precise autonomous navigation. In this study, we propose a stereo visual-inertial mapping system. The front-end is a robust stereo VIO based on tightly coupled sliding-window optimization. The core of the back-end is a global Bundle Adjustment (BA), a nonlinear optimization into which IMU measurements are also added as time-domain constraints. Meanwhile, the stereo-camera-IMU extrinsic calibration is refined within the BA to improve mapping accuracy. Keyframe and map-point selection criteria are designed according to the characteristics of AMR applications. Further, a forward and backward Perspective-n-Point (PnP) method is adopted to avoid loop-detection mismatches. The performance of the system was validated and compared against other state-of-the-art algorithms. The findings revealed the effectiveness and robustness of this stereo visual-inertial mapping algorithm.
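The forward and backward PnP check mentioned above verifies a loop candidate by estimating the relative pose in both directions and accepting the loop only if the two estimates agree. The paper does not give implementation details; the following is a minimal sketch of one way such a consistency gate could work, assuming 4x4 homogeneous transforms from two independent PnP solves. The function name and tolerance values are illustrative, not from the paper.

```python
import numpy as np

def pose_consistent(T_ab, T_ba, trans_tol=0.05, rot_tol_deg=2.0):
    """Forward-backward consistency gate for a loop-closure candidate.

    T_ab: 4x4 pose of frame b expressed in frame a (forward PnP result)
    T_ba: 4x4 pose of frame a expressed in frame b (backward PnP result)
    If both PnP solutions are correct, T_ab @ T_ba is close to identity;
    a mismatch in either solve leaves a large residual transform.
    """
    E = T_ab @ T_ba                              # residual; ~I for a true loop
    trans_err = np.linalg.norm(E[:3, 3])         # residual translation (m)
    # residual rotation angle from the trace of the 3x3 rotation part
    cos_theta = np.clip((np.trace(E[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    rot_err_deg = np.degrees(np.arccos(cos_theta))
    return trans_err < trans_tol and rot_err_deg < rot_tol_deg
```

A candidate passing this gate would then be added as a loop constraint in the global BA; one that fails is discarded as a likely perceptual-aliasing mismatch.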
