Abstract

This paper presents a novel approach for estimating the ego-motion of a vehicle in dynamic and unknown environments using tightly-coupled inertial and visual sensors. To improve accuracy and robustness, we combine point and line features to aid navigation. The mathematical framework is based on trifocal geometry among image triplets, which provides a simple, unified treatment of point and line features. For the fusion algorithm, we employ the Extended Kalman Filter (EKF) for error-state prediction and covariance propagation, and the Sigma Point Kalman Filter (SPKF) for robust measurement updating in the presence of strong nonlinearities. Outdoor and indoor experiments show that combining point and line features improves estimation accuracy and robustness compared to an algorithm using point features alone.
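The hybrid filter architecture described above can be illustrated with a minimal NumPy sketch (a hypothetical illustration, not the authors' implementation): the prediction step propagates the error state and covariance through a linearized transition model, EKF-style, while the measurement update passes sigma points through the nonlinear measurement model instead of linearizing it.

```python
import numpy as np

def ekf_propagate(x, P, F, Q):
    """EKF-style prediction: the linearized transition matrix F propagates
    the error-state estimate and its covariance; Q is process noise."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def sigma_points(x, P, kappa):
    """Generate 2n+1 sigma points and weights (plain unscented transform)."""
    n = x.size
    S = np.linalg.cholesky((n + kappa) * P)
    pts = [x] + [x + S[:, i] for i in range(n)] + [x - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
    w[0] = kappa / (n + kappa)
    return np.array(pts), w

def spkf_update(x, P, z, h, R, kappa=1.0):
    """Sigma-point measurement update: propagate sigma points through the
    nonlinear measurement model h, then fuse with measurement z (noise R)."""
    pts, w = sigma_points(x, P, kappa)
    Z = np.array([h(p) for p in pts])          # predicted measurements
    z_hat = w @ Z                              # predicted measurement mean
    dz = Z - z_hat
    dx = pts - x
    Pzz = (w[:, None] * dz).T @ dz + R         # innovation covariance
    Pxz = (w[:, None] * dx).T @ dz             # state-measurement cross-covariance
    K = Pxz @ np.linalg.inv(Pzz)               # Kalman gain
    x_new = x + K @ (z - z_hat)
    P_new = P - K @ Pzz @ K.T
    return x_new, P_new
```

In the paper's setting, `h` would be the trifocal point/line transfer constraint; here it is left abstract so the sketch only shows the filter structure.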

Highlights

  • Reliable navigation in dynamic and unknown environments is a key requirement for many applications, such as autonomous ground, underwater and air vehicles

  • Fast and highly dynamic motions can be precisely tracked by an Inertial Measurement Unit (IMU) over short time spans, and the scale ambiguity and large latency of vision can be mitigated to a certain extent

  • We propose a method that combines point and line features for navigation aiding in a simple and unified framework


Introduction

Reliable navigation in dynamic and unknown environments is a key requirement for many applications, such as autonomous ground, underwater and air vehicles. The most common sensor modality used to tackle this problem is the Inertial Measurement Unit (IMU). The complementary frequency responses and noise characteristics of vision and inertial sensors address each other's limitations and deficiencies [10]. Fast and highly dynamic motions can be precisely tracked by an IMU over short time spans, and the scale ambiguity and large latency of vision can be mitigated to a certain extent. Conversely, the low-frequency drift in the inertial measurements can be significantly controlled by visual observations. Both cameras and IMUs are low-cost, light-weight and low-power devices, which makes them ideal for many payload-constrained platforms. Corke [10] has presented a comprehensive introduction to these two sensory modalities from both a biological and an engineering perspective.
