Abstract

Robotic mapping and odometry are the primary competencies of a navigation system for an autonomous mobile robot. However, the robot's state estimate typically accumulates drift over time, and its accuracy degrades critically when only proprioceptive sensors are used in indoor environments. Moreover, the accuracy of ego-motion estimation is severely diminished in dynamic environments because of the influence of dynamic objects and light reflections. To this end, multi-sensor fusion is employed to bound the navigation error by exploiting the complementary nature of the Inertial Measurement Unit (IMU) and the bearing information of the camera. In this paper, we propose a robust tightly-coupled Visual-Inertial Navigation System (VINS) based on multi-stage outlier removal within the Multi-State Constraint Kalman Filter (MSCKF) framework. First, an efficient and lightweight VINS algorithm is developed for robust state estimation of a mobile robot using a stereo camera and an IMU in dynamic indoor environments. Furthermore, we propose strategies to mitigate the impact of dynamic objects through multi-stage outlier removal based on feedback from the estimated states. The proposed VINS is implemented and validated on public datasets. In addition, we develop a sensor system and evaluate the algorithm in dynamic indoor environments under different scenarios. The experimental results show better robustness and accuracy with lower computational complexity than state-of-the-art approaches.
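
A standard way for MSCKF-style filters to reject measurement outliers, and one plausible reading of "outlier removal based on the feedback information of estimated states", is a chi-square (Mahalanobis) gate on each feature's stacked residual. The sketch below is a minimal illustration under that assumption; the function name, interfaces, and the 95% confidence level are ours, not taken from the paper.

```python
# Hypothetical sketch of the chi-square (Mahalanobis) gate commonly used by
# MSCKF-style filters to reject feature measurements as outliers before the
# filter update. Names and the confidence level are illustrative.
import numpy as np
from scipy.stats import chi2

def passes_chi_square_gate(r, H, P, R, confidence=0.95):
    """Accept a feature's stacked residual only if its Mahalanobis
    distance is consistent with the current state uncertainty.

    r : (m,)   stacked reprojection residual of one feature track
    H : (m,n)  measurement Jacobian w.r.t. the filter state
    P : (n,n)  state covariance from the sliding window
    R : (m,m)  measurement noise covariance
    """
    S = H @ P @ H.T + R                 # innovation covariance
    d2 = r @ np.linalg.solve(S, r)      # squared Mahalanobis distance
    return d2 < chi2.ppf(confidence, df=r.size)
```

Because the gate uses the covariance of the estimated state, features on dynamic objects whose residuals disagree with the predicted motion are rejected rather than fused.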

Highlights

  • We design a robust stereo-vision-aided inertial navigation system with high accuracy and low computational cost, based on multi-stage outlier removal and targeting resource-constrained devices in stochastic environments

  • We introduce multi-stage outlier-removal strategies; in particular, a multi-stage approach that removes outlier features caused by the influence of dynamic objects, based on the state feedback information in the sliding window (a skeleton of such a pipeline is sketched after this list)

  • We evaluate the proposed Visual-Inertial Navigation System (VINS) algorithm on both public datasets and our own dataset for the indoor environment

  • We evaluate the proposed algorithm on the EuRoC dataset without using Simultaneous Localization and Mapping (SLAM) landmarks
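
Conceptually, such a cascade can be arranged as successive rejection stages, with cheap geometric checks first and the state-feedback gate last. The skeleton below is a hypothetical illustration: the stage predicates named in the comments are placeholders, not the paper's actual stages.

```python
def multi_stage_outlier_removal(tracks, stages):
    """Pass feature tracks through successive rejection stages.

    Each stage is a predicate over a single track; later, state-aware
    stages (e.g. the chi-square gate sketched above) can close over the
    estimated state fed back from the sliding window.
    """
    for keep in stages:
        tracks = [t for t in tracks if keep(t)]
    return tracks

# Hypothetical wiring; these stage predicates are placeholders:
#   stages = [stereo_epipolar_ok, ransac_inlier, make_chi_square_gate(state, P)]
#   inliers = multi_stage_outlier_removal(tracks, stages)
```

Ordering the stages from cheapest to most expensive keeps the per-frame cost low, which matches the resource-constrained setting the highlights describe.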

Introduction

The field of robotics has witnessed remarkable advances in both academia and industry with the assistance of Artificial Intelligence (AI). This evolution of technology has enabled various real-time applications of mobile robots, such as search-and-rescue missions in disasters, shipping and delivering small packages, and self-driving vehicles [1,2]. Localization is a crucial competence of an autonomous mobile robot, yet it typically accumulates error over time in long-term navigation. Although Real-Time Kinematic (RTK)-GPS can provide outstanding precision for outdoor localization, it is not suitable for indoor environments or for GPS-denied outdoor environments such as under bridges or among high buildings. To this end, the Multi-Sensor Fusion Technique (MSFT) is developed to bound the navigation error by combining the advantages of proprioceptive and exteroceptive sensors; it also enables loop-closure detection to further enhance accuracy.
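
As a toy illustration of that complementarity (not from the paper; all noise values are made up), the 1-D example below lets a biased dead-reckoning estimate drift, then corrects it with sparse absolute position fixes, the role the camera measurements play in a VINS.

```python
# Toy 1-D example: proprioceptive integration alone drifts without bound,
# while occasional exteroceptive position fixes keep the error bounded.
import numpy as np

rng = np.random.default_rng(0)
x_true, x_est, P = 0.0, 0.0, 1.0
Q, R = 0.01, 0.25                        # process (IMU) / measurement (camera) noise
for k in range(100):
    v = 1.0                              # true velocity
    x_true += v
    x_est += v + rng.normal(0.05, 0.1)   # biased IMU-style integration -> drift
    P += Q
    if k % 10 == 0:                      # sparse camera-style position fix
        z = x_true + rng.normal(0.0, np.sqrt(R))
        K = P / (P + R)                  # scalar Kalman gain
        x_est += K * (z - x_est)
        P *= (1 - K)
print(f"final error with fusion: {abs(x_est - x_true):.2f} m")
```

Without the update step the bias alone would accumulate roughly 5 m of drift over the run; the sparse corrections keep the error near the measurement noise level.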
