Abstract
Mapping and localization of mobile robots in an unknown environment are essential for most high-level operations like autonomous navigation or exploration. This paper presents a novel approach for combining estimated trajectories, namely curvefusion. The robot used in the experiments is equipped with a horizontally mounted 2D profiler, a constantly spinning 3D laser scanner and a GPS module. The proposed algorithm first combines trajectories from different sensors to optimize the poses of the planar three degrees of freedom (DoF) trajectory, which is then fed into continuous-time simultaneous localization and mapping (SLAM) to further improve the trajectory. While state-of-the-art multi-sensor fusion methods mainly focus on probabilistic techniques, our approach instead adopts a deformation-based method to optimize poses. To this end, a similarity metric for curved shapes is introduced into the robotics community to fuse the estimated trajectories. Additionally, a shape-based point correspondence estimation method is applied to multi-sensor time calibration. Experiments show that the proposed fusion method achieves better accuracy even when the pre-fusion trajectory error is large, demonstrating that our method maintains a certain degree of accuracy in environments where typical pose estimation methods perform poorly. In addition, the proposed time-calibration method also achieves high accuracy in estimating point correspondences.
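The abstract does not spell out the fusion algorithm itself, but the two ingredients it names — shape-based point correspondence between trajectories and a blending of corresponding poses — can be illustrated with a minimal sketch. The snippet below is an assumption-laden stand-in, not the paper's actual method: it uses dynamic time warping (a common shape-based matcher) to estimate correspondences between two 2D trajectories, then blends matched points with a fixed weight. The function names `dtw_correspondences` and `fuse` are hypothetical.

```python
import numpy as np

def dtw_correspondences(a, b):
    """Estimate point correspondences between two 2D trajectories
    via dynamic time warping (a stand-in for the paper's shape-based
    correspondence estimation). Returns a list of index pairs (i, j)."""
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j - 1],
                                 cost[i - 1, j],
                                 cost[i, j - 1])
    # Backtrack the optimal warping path from the corner.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = int(np.argmin([cost[i - 1, j - 1],
                              cost[i - 1, j],
                              cost[i, j - 1]]))
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    return path[::-1]

def fuse(a, b, w=0.5):
    """Toy trajectory fusion: blend corresponding points of two
    trajectories with weight w (the actual paper deforms curves
    under a similarity metric rather than averaging)."""
    pairs = dtw_correspondences(a, b)
    return np.array([w * a[i] + (1.0 - w) * b[j] for i, j in pairs])
```

In a real system the weight would depend on per-sensor uncertainty, and the deformation would preserve the local shape of the more reliable curve rather than averaging pointwise.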
Highlights
Autonomous mobile robots have a wide range of applications, including planetary exploration, rescue, disaster scenarios and other tasks that are hazardous or unfeasible for humans
While typical multi-sensor fusion methods mainly rely on filtering frameworks, we provide a new perspective, i.e., curve deformation, to deal with trajectory fusion
In the first data set, the robot was driven starting in front of the robotics hall around the old ZAE building (Bavarian Center for Applied Energy Research)
Summary
Autonomous mobile robots have a wide range of applications, including planetary exploration, rescue, disaster scenarios and other tasks that are hazardous or unfeasible for humans. Mobile mapping is defined as the data acquisition of the surrounding environment employing a mobile platform with several sensors. Registration, i.e., merging the acquired scans or images into a common coordinate system, is an essential step in mobile mapping. The poses from the positioning sensors cannot be used directly due to imprecise measurements. Four main sensor technologies are commonly used for robot localization, namely global navigation satellite systems (GNSSs), inertial navigation systems (INSs), vision-based and laser-based methods. The GNSS is a simple and widely used solution for localization
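The registration step described above amounts to transforming each scan from the sensor frame into a common world frame using the sensor's pose. For the planar 3-DoF case mentioned in the abstract, this is a 2D rigid transform; the helper below is a minimal illustrative sketch (the function name `to_world` is hypothetical).

```python
import numpy as np

def to_world(points, pose):
    """Transform 2D scan points from the sensor frame into a common
    world frame using a planar 3-DoF pose (x, y, theta)."""
    x, y, theta = pose
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s],
                  [s,  c]])          # rotation by theta
    return points @ R.T + np.array([x, y])  # rotate, then translate
```

Applying each scan's optimized pose this way is what turns a sequence of raw range scans into a consistent map, which is why pose accuracy directly determines map quality.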