Abstract

Locating an inspection robot is an essential task for inspection missions and spatial data acquisition. Giving a spatial reference to measurements, especially those concerning environmental parameters (e.g., gas concentrations), makes them more valuable by enabling more insightful analyses. Thus, accurate estimation of sensor position and orientation is a significant topic in mobile measurement systems used in robotics, remote sensing, and autonomous vehicles. Such systems often operate in urban or underground conditions, which limit or preclude the use of Global Navigation Satellite Systems (GNSS) for this purpose. Alternative solutions vary significantly in sensor configuration requirements, positioning accuracy, and computational complexity, which makes selecting the optimal one difficult. The focus here is on assessment using the criterion of positioning accuracy for a mobile robot operating without GNSS signals. Automated geodetic surveying equipment is utilized to acquire precise ground-truth data of the robot’s movement. The results obtained with several methods are compared: wheel odometry, dead reckoning based on inertial measurements, visual odometry, and trilateration of ultra-wideband signals. The suitability, advantages, and disadvantages of each method are discussed in the context of their application in autonomous robotic systems operating in an underground mine environment.

Highlights

  • Determining the sensor position and orientation in space is an important part of the data acquisition process during inspection missions

  • It is crucial for providing spatial context for various measurements, which in turn allows for associating observed phenomena with the places of their occurrence and enables them to be a subject of spatial analyses. Data collected for this purpose, such as signals from Global Navigation Satellite Systems (GNSS), laser scans, depth and stereo images, or inertial and odometry measurements, can be utilized for obtaining geometrical information about the observer’s path and their surroundings

  • Smoothed data were converted to degrees from radians and cumulatively summed to obtain heading directions for the consecutive positions of the vehicle
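The highlighted heading computation — converting smoothed per-step turn angles from radians to degrees and cumulatively summing them to get the heading at each consecutive vehicle position — can be sketched as follows. This is a minimal illustration, not the authors' implementation; the function name and the assumption that the input is an array of per-step turn angles in radians are mine.

```python
import numpy as np

def headings_from_turns(smoothed_turns_rad):
    """Cumulative heading directions (degrees) for consecutive
    vehicle positions, from smoothed per-step turn angles (radians).

    Assumes each element is the incremental turn between two
    consecutive positions, already smoothed upstream.
    """
    turns_deg = np.degrees(smoothed_turns_rad)  # radians -> degrees
    return np.cumsum(turns_deg)                 # running sum = heading

# Example: four equal left turns of pi/8 rad each
headings = headings_from_turns(np.array([np.pi / 8] * 4))
# headings accumulate: 22.5, 45.0, 67.5, 90.0 degrees
```

A running sum like this drifts with any bias in the smoothed turn rates, which is the usual limitation of dead-reckoning heading estimates discussed in the abstract.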



Introduction

Determining the sensor position and orientation in space is an important part of the data acquisition process during inspection missions. It is crucial for providing spatial context for various measurements, which in turn allows for associating observed phenomena with the places of their occurrence and enables them to be a subject of spatial analyses. Data collected for this purpose, such as signals from Global Navigation Satellite Systems (GNSS), laser scans, depth and stereo images, or inertial and odometry measurements, can be utilized for obtaining geometrical information about the observer’s path and their surroundings. Recent advancements in modern deep learning image processing algorithms aim to allow the creation of 3D models with only a monocular camera. Another prominent field of study, in which sensor position estimation is crucial, is autonomous robot and vehicle navigation. Numerous data fusion algorithms have been proposed both to prevent an autonomous mission from total failure in such cases and to increase the reliability and accuracy of position estimation [5,6].

