Abstract

Autonomous navigation of mobile robots in complex environments is challenging. Point-based visual simultaneous localization and mapping (vSLAM) cannot cope with the inaccurate localization and frequent tracking losses that mobile robots suffer in challenging scenes. This paper proposes a real-time, robust point-line monocular visual-inertial SLAM (VINS) system for mobile robots in smart cities towards 6G. To extract robust line features for tracking in challenging scenes, EDLines with adaptive gamma correction is adopted to rapidly extract a larger proportion of long line features among all extracted line features. A real-time line feature matching approach is proposed to track the extracted line features between adjacent frames without computing descriptors. Compared with LSD combined with KNN matching on LBD descriptors, the proposed method runs three times faster. Furthermore, a tightly coupled sensor fusion optimization framework, combining point-line feature reprojection errors and IMU residuals, is constructed for accurate state estimation. Evaluation on public benchmark datasets shows that our VINS system achieves higher localization accuracy, better real-time performance, and greater robustness than other state-of-the-art SLAM systems, enabling mobile robots to localize accurately in the complex environments of smart cities.
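The abstract mentions applying adaptive gamma correction before EDLines extraction to make line features more robust in poorly exposed scenes. The paper's exact formula is not given here, so the following is only a minimal sketch of one common adaptive rule (an assumption): choose the gamma exponent from the image's mean intensity so that dark frames are brightened and bright frames darkened toward mid-gray before line extraction.

```python
import numpy as np

def adaptive_gamma_correct(gray: np.ndarray) -> np.ndarray:
    """Pre-process an 8-bit grayscale frame before line extraction.

    Assumed adaptive rule (not from the paper): pick
    gamma = log(0.5) / log(mean / 255), so a dark image (mean < 128)
    gets gamma < 1 (brightening) and a bright image gets gamma > 1.
    The correction is applied via a 256-entry lookup table.
    """
    mean = min(max(gray.mean() / 255.0, 1e-6), 1.0 - 1e-6)  # clamp to avoid log(0)
    gamma = np.log(0.5) / np.log(mean)                      # adaptive exponent
    lut = ((np.arange(256) / 255.0) ** gamma * 255.0).astype(np.uint8)
    return lut[gray]                                        # per-pixel LUT apply
```

The corrected frame would then be passed to the EDLines detector; the LUT keeps the correction cheap enough for real-time use.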
