Abstract

Visual–inertial odometry (VIO) is the front end of visual simultaneous localization and mapping (vSLAM) methods and has been actively studied in recent years. In this context, the time-of-flight (ToF) camera, with its high depth-measurement accuracy and strong resilience to ambient light of variable intensity, draws our interest. In this paper, we therefore present a real-time visual–inertial system based on a low-cost ToF camera. The iterative closest point (ICP) methodology is adopted, incorporating salient point-selection criteria and a robustness-weighting function. In addition, an error-state Kalman filter is used to fuse the visual odometry estimates with inertial measurement unit (IMU) data. To test its capability, the ToF–VIO system is mounted on an unmanned aerial vehicle (UAV) platform and operated in a variable-light environment. The estimated flight trajectory is compared with ground truth data captured by a motion capture system. Real flight experiments are also conducted in a dark indoor environment, demonstrating good estimation performance. The current system is thus shown to be accurate and efficient for UAV applications in dark, Global Navigation Satellite System (GNSS)-denied environments.
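
For context, the error-state Kalman filter mentioned above maintains a nominal state propagated by IMU integration and a small error state corrected by the visual measurements. The generic structure, written here in our own notation rather than the paper's exact state definition, is:

    True state:    x_true = x ⊕ δx   (nominal state composed with the error state)
    Predict:       δx_{k+1} = F_k δx_k,   P_{k+1} = F_k P_k F_kᵀ + Q_k
    Update:        K = P Hᵀ (H P Hᵀ + R)⁻¹,   δx̂ = K (z − h(x))
    Inject/reset:  x ← x ⊕ δx̂,   δx̂ ← 0,   P ← (I − K H) P

where x is the nominal state, δx the error state, z the pose measurement from the ICP front end, and F, H, Q, R the usual transition, measurement, and noise matrices.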

Highlights

  • The study is motivated by the need to design and build an efficient, low-cost, robot-mounted localization system for indoor industrial facility inspection, tunnel inspection, and search and rescue missions inside earthquake- or explosion-damaged buildings in poor lighting and Global Navigation Satellite System (GNSS)-denied conditions

  • Salient point selection with a statistics-based weighting factor is applied in the basic iterative closest point (ICP) algorithm, and the estimation results are fused with the inertial measurement unit (IMU) data using an error-state Kalman filter (see the weighting sketch after this list)

  • We describe the development of a ToF-camera-based visual–inertial odometry (VIO) system
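
The statistics-based weighting in the second highlight can be illustrated with a minimal sketch. The exact statistic is not spelled out here, so the function below is a hypothetical example assuming a median-absolute-deviation (MAD) scale estimate with Huber-style down-weighting of outlying point pairs; pair_weights and the threshold k are our own names:

    import numpy as np

    def pair_weights(residuals, k=1.345):
        # residuals: (N,) point-to-point distances of matched ICP pairs.
        # Robust scale estimate via the median absolute deviation (MAD).
        med = np.median(residuals)
        sigma = max(1.4826 * np.median(np.abs(residuals - med)), 1e-9)
        r = np.abs(residuals) / sigma
        # Huber-style weight: full weight inside the threshold, k/r beyond it.
        return np.where(r <= k, 1.0, k / r)

Each weight then multiplies its pair's contribution in the weighted least-squares alignment step of ICP.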

Summary

Introduction

The study is motivated by the need to design and build an efficient, low-cost, robot-mounted localization system for indoor industrial facility inspection, tunnel inspection, and search and rescue missions inside earthquake- or explosion-damaged buildings in poor lighting and GNSS-denied conditions. Vision-based state estimation has become one of the most active research areas in the past few years, and there have been many notable VO, VIO, and vSLAM works, such as parallel tracking and mapping (PTAM) [1], SVO+MSF (semi-direct visual odometry + multiple sensor fusion) [2,3,4], MSCKF (multi-state constraint Kalman filter) [5], OKVIS (open keyframe-based visual–inertial SLAM), ORB-SLAM (oriented FAST and rotated BRIEF SLAM) [9], VI-DSO (visual–inertial direct sparse odometry) [10], and KinectFusion [11]. These works can be categorized by their pose estimation methods and the ways they fuse IMU data (Table 1). Approaches to depth-information-based pose estimation generally use one of two optimization methods: the direct optimization method and the iterative closest point (ICP) method.
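
To make the second family concrete, the core of one point-to-point ICP step is a closed-form weighted rigid alignment of matched point pairs. The sketch below is the standard Kabsch/Umeyama solution, not the paper's full pipeline (which adds salient-point selection and robust weighting); in full ICP this step alternates with nearest-neighbour matching until convergence:

    import numpy as np

    def best_rigid_transform(src, dst, w=None):
        # Least-squares R, t minimizing sum_i w_i * ||R @ src[i] + t - dst[i]||^2.
        # src, dst: (N, 3) arrays of matched points; w: optional (N,) weights.
        w = np.ones(len(src)) if w is None else np.asarray(w, float)
        w = w / w.sum()
        mu_s = (w[:, None] * src).sum(axis=0)             # weighted centroids
        mu_d = (w[:, None] * dst).sum(axis=0)
        H = (w[:, None] * (src - mu_s)).T @ (dst - mu_d)  # cross-covariance
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflections
        R = Vt.T @ D @ U.T
        t = mu_d - R @ mu_s
        return R, t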

Estimation Method and Input
ToF–VIO Platform
Visual Odometry by the ToF Camera Module
Selection of Salient Points
Background
Acceptance Rules
Weighting of the Point Pairs
Modelling Equations of IMU Sensors
Error State Kalman Filter
Kinematics of True and Nominal State
Updating Process of Error State and Covariance Matrix
ESKF Innovation Process
Integration and Re-Integration
Background Outlier Threshold
Handheld Test
Method
UAV Test in Lab Environment
Field Test in Corridor
Exploration Test Using a Ground Moving Platform
Analysis
Conclusions
