Abstract

The navigation of an autonomous robotic vehicle is a difficult task. Accurate measurement of robotic vehicle motion is a problem in certain environments. In desert and other terrains, wheel slip affects the accuracy of odometry sensors. Poorly lit underground environments present problems for passive vision systems. In addition, for slow-moving vehicles, the effects of INS drift errors can be large even over short distances. An active triangulation scanning laser camera sensor, which can provide accurate 3D images at distances of less than 10 m, has the potential to alleviate these problems by improving the accuracy of integrated navigation systems for robotic vehicles operating in such environments. Knowledge of the relative position measurement accuracy of scanning laser cameras in various environments will allow navigation system designers to determine whether incorporating these sensors will help to meet their system accuracy requirements. This paper presents an experimental method for determining the relative position measurement accuracy of an auto-synchronous triangulation scanning laser camera. 3D images of a simulated desert terrain environment were taken from multiple camera positions and orientations. Overlapping images were registered using an Iterative Closest Point (ICP) based algorithm to estimate the change in position and orientation of the laser camera. Truth data for the position and orientation of the laser camera at each location was obtained by using theodolites to measure the locations of survey targets mounted on the camera. The relative position estimates were then compared to the truth data. In this paper, the experiment design and implementation are detailed, and preliminary experimental results are presented.
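The registration step described above rests on point-to-point ICP. The paper does not reproduce its implementation, so the following is only a minimal sketch of the general technique, assuming a Python/NumPy/SciPy toolchain (not named in the paper) and synthetic point clouds in place of the actual terrain scans. It shows how a rigid transform between two overlapping 3D images can be estimated by iterating closest-point matching and an SVD-based least-squares fit.

```python
# Minimal point-to-point ICP sketch (illustrative, not the authors' code).
# Point clouds here are synthetic; real scans would come from the laser camera.
import numpy as np
from scipy.spatial import cKDTree


def best_fit_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst via SVD."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against a reflection solution
        Vt[-1, :] *= -1
        R = Vt.T @ U.T
    t = mu_d - R @ mu_s
    return R, t


def icp(source, target, max_iter=50, tol=1e-6):
    """Iteratively align `source` to `target`; return the accumulated (R, t)."""
    tree = cKDTree(target)
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(max_iter):
        # 1. closest-point correspondences
        dist, idx = tree.query(src)
        # 2. best rigid transform for the current matches
        R, t = best_fit_transform(src, target[idx])
        # 3. apply the update and compose it with the running estimate
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = dist.mean()
        if abs(prev_err - err) < tol:  # converged
            break
        prev_err = err
    return R_total, t_total


if __name__ == "__main__":
    # Hypothetical example: displace a random cloud by a known small motion
    # and check that ICP recovers it.
    rng = np.random.default_rng(0)
    source = rng.uniform(-1.0, 1.0, size=(500, 3))
    angle = np.radians(5.0)
    R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                       [np.sin(angle),  np.cos(angle), 0.0],
                       [0.0, 0.0, 1.0]])
    t_true = np.array([0.10, -0.05, 0.02])
    target = source @ R_true.T + t_true
    R_est, t_est = icp(source, target)
    print("true t:     ", t_true)
    print("estimated t:", t_est)
```

In the experiment, the recovered (R, t) between successive camera poses would play the role of the relative position estimate that is then compared against the theodolite-derived truth data; the example above simply uses a known synthetic displacement for that comparison.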
