Abstract

One of the most important and challenging tasks for mobile robots that navigate autonomously is localisation – the process whereby a robot locates itself within a map of a known environment, or with respect to a known starting point within an unknown environment. In an unknown environment, a robot is localised by tracking its trajectory from the initial pose. Trajectory estimation becomes a challenge if the robot is operating in an unknown environment that has a scarcity of landmarks, is GPS-denied, has very little or no illumination, and is slippery – such as an underground mine. This paper attempts to solve the problem of estimating a robot's trajectory in underground mining environments using a time-of-flight (ToF) camera and an inertial measurement unit (IMU). In the past this problem has been addressed with 3D laser scanners; these offer high measurement accuracy and a wide field of view, but are expensive and consume a lot of power. Here, trajectory estimation is accomplished by fusing the ego-motion provided by the ToF camera with measurement data provided by a low-cost IMU. The fusion is performed using the Kalman filter algorithm on a mobile robot moving on a 2D planar surface, and a Vicon system provides the ground truth for evaluating the estimated trajectory. Trajectory estimation using the ToF camera alone is prone to errors, especially when the robot is rotating, but the fused algorithm estimates accurate ego-motion even during rotation, showing a significant improvement in trajectory estimation.
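The fusion step described in the abstract can be illustrated with a minimal linear Kalman filter over the planar pose. The sketch below is an illustration only, not the authors' implementation: the state layout [x, y, θ], the identity motion and measurement models (IMU increments as the control input, the ToF ego-motion pose as a direct measurement), and the noise magnitudes q and r are all assumptions made for this example.

```python
import numpy as np

class PoseKF:
    """Minimal linear Kalman filter over a planar pose [x, y, theta].

    Prediction dead-reckons with an IMU-derived pose increment u;
    correction treats the ToF camera's ego-motion pose estimate z as a
    direct measurement of the full state (H = I).
    """

    def __init__(self, q=1e-3, r=1e-2):
        self.x = np.zeros(3)    # state estimate: x, y, theta
        self.P = np.eye(3)      # state covariance
        self.Q = q * np.eye(3)  # process (IMU) noise covariance
        self.R = r * np.eye(3)  # measurement (ToF) noise covariance

    def predict(self, u):
        # Dead-reckon with the IMU increment; F = I for this simple model.
        self.x = self.x + u
        self.P = self.P + self.Q

    def update(self, z):
        # Standard Kalman correction with H = I.
        S = self.P + self.R                  # innovation covariance
        K = self.P @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ (z - self.x)   # fused state
        self.P = (np.eye(3) - K) @ self.P
        return self.x
```

Because the ToF noise r is set smaller than the accumulated prediction uncertainty, the fused estimate is pulled most of the way toward the camera measurement while still smoothing its jitter with the IMU prediction.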


INTRODUCTION

There has been a lot of interest recently in autonomous mobile robots, especially those being used in dangerous work environments. The problem of localisation becomes a major challenge if the robot is operating in unknown environments that have few landmarks, are GPS-denied, have very little or no illumination, and are slippery – such as underground mine stopes. In such an environment, the robot is localised by tracking the trajectory of the robot, given the initial pose. Trajectory estimation in environments like underground mines has been successfully achieved by using three-dimensional (3D) laser scanners [1], [2], [3]. These provide high-accuracy measurements with a wide field of view, making it easy to estimate the transformation between two scans using registration algorithms such as iterative closest point (ICP) [4]. The rest of the paper is organised as follows: Section 2 gives the background and related work on ToF camera ego-motion and the fusion of the camera's ego-motion with IMU data; Section 3 describes the design and analysis of the full system; Section 4 gives the experimental results and discussions; and Section 5 presents the conclusion and discusses future work.
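The scan-registration idea referenced above can be sketched with a minimal 2D ICP. This is a generic textbook illustration under simplifying assumptions, not the pipeline of the cited laser-scanner work: correspondences are found by brute-force nearest-neighbour search (a real system would use a k-d tree), and each iteration solves for the best rigid transform with the SVD-based Kabsch method.

```python
import numpy as np

def icp_2d(src, dst, iters=20):
    """Minimal 2D ICP: align point set src (N x 2) onto dst (M x 2).

    Returns a 2x2 rotation R and translation t such that
    (R @ src.T).T + t approximately matches dst.
    """
    R_total = np.eye(2)
    t_total = np.zeros(2)
    cur = src.copy()
    for _ in range(iters):
        # 1. Correspondences: nearest dst point for each current src point.
        d2 = ((cur[:, None, :] - dst[None, :, :]) ** 2).sum(-1)
        matched = dst[d2.argmin(axis=1)]
        # 2. Best rigid transform for these pairs (Kabsch / SVD).
        mu_s, mu_d = cur.mean(0), matched.mean(0)
        H = (cur - mu_s).T @ (matched - mu_d)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, np.linalg.det(Vt.T @ U.T)])  # guard against reflection
        R = Vt.T @ D @ U.T
        t = mu_d - R @ mu_s
        # 3. Apply the increment and accumulate the total transform.
        cur = (R @ cur.T).T + t
        R_total = R @ R_total
        t_total = R @ t_total + t
    return R_total, t_total
```

With scans that overlap well and a small initial misalignment, the correspondences become correct after a few iterations and the estimate converges to the true rigid transform; with large rotations ICP can lock onto a local minimum, which is one reason the paper turns to IMU fusion when the robot rotates.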

ToF camera ego-motion estimation
Fusion of ToF camera ego-motion and IMU data
DESIGN AND ANALYSIS
Jump edge filter
Non-homogeneous illumination
Phase wrap filter
Fused ToF-IMU ego-motion estimation
EXPERIMENTAL RESULTS AND DISCUSSIONS
CONCLUSION AND FUTURE WORK
