Abstract

A vision-aided inertial navigation system is vitally important for providing accurate and reliable navigation information to unmanned ground vehicles (UGVs) in Global Navigation Satellite System-challenged or -denied environments. Yet unpredictable stochastic disturbances, caused by ego-motion uncertainty on uneven pavements and in unstructured, dynamic scenes, are almost inevitable and may lead to divergence of the estimation algorithm and unreliable positioning. In this article, we develop a robust strategy for vision-aided inertial navigation that resists such uncertainties, modifying the unscented Kalman filter with a novel adaptive factor formulated from the degree of abnormity (DoA). Applying this strategy, a monocular camera and an inertial measurement unit (IMU) are tightly fused to construct a robust vision-aided inertial navigation system. During the fusion of IMU and camera measurements, the new method automatically estimates the DoA of UGV ego-motion uncertainty online by comparing the factual (sample) innovation covariance with the estimated (predicted) innovation covariance. The estimation error covariance matrix and the Kalman gain are then adaptively adjusted for different scenarios according to the DoA level. Because the modification stays within the Kalman filter framework, the proposed adjustment strategy achieves real-time performance, robustness, and high-accuracy positioning. Experiments with the KITTI dataset are carried out under different road conditions. The results show that the robust vision-aided inertial navigation system is robust and self-adaptive against ego-motion uncertainty and improves UGV positioning accuracy.
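The core mechanism the abstract describes, comparing the factual (sample) innovation covariance against the filter's predicted innovation covariance and inflating the error covariance and Kalman gain when they disagree, can be sketched as follows. This is a minimal, generic innovation-based adaptation scheme, not the paper's exact DoA formula; the trace-ratio factor, the sliding window, and all function names here are illustrative assumptions.

```python
import numpy as np

def adaptive_factor(innovations, S_est):
    """Illustrative DoA-style factor: ratio of the factual (sample)
    innovation covariance to the predicted innovation covariance.
    The paper's actual DoA definition may differ."""
    d = np.atleast_2d(np.asarray(innovations))
    S_fact = d.T @ d / d.shape[0]        # sample ("factual") innovation covariance
    doa = np.trace(S_fact) / np.trace(S_est)
    return max(1.0, doa)                 # inflate only when behavior is abnormal

def adaptive_kf_update(x, P, z, H, R, window):
    """One Kalman measurement update with innovation-based
    covariance inflation (linear-KF sketch, not the full UKF)."""
    y = z - H @ x                        # innovation
    window.append(y)                     # sliding window of recent innovations
    S_est = H @ P @ H.T + R              # predicted innovation covariance
    lam = adaptive_factor(window, S_est)
    P_infl = lam * P                     # scale error covariance by the factor
    S = H @ P_infl @ H.T + R
    K = P_infl @ H.T @ np.linalg.inv(S)  # adapted Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P_infl
    return x_new, P_new
```

When the observed innovations stay consistent with the filter's own prediction, the factor clamps to 1 and the update reduces to a standard Kalman step; under abnormal ego-motion the inflated covariance raises the gain's denominator structure so the filter trusts the abnormal measurements less aggressively in subsequent steps.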
