Abstract
Global localization, which determines an accurate global position without prior knowledge of the robot's pose, is a fundamental requirement for a mobile robot. Map-based global localization estimates a precise position by comparing a given geometric map with current sensor data. Although 3D range data are preferable for 6D global localization in terms of accuracy and reliability, matching against large 3D data sets is time-consuming. Appearance-based global localization, in contrast, determines the global position by comparing a captured image with recorded ones; it is simple and suitable for real-time processing. However, this technique does not work in the dark or in environments where the lighting conditions change significantly. We have previously proposed a two-step strategy that combines map-based and appearance-based global localization. Instead of the camera images used in appearance-based global localization, we use reflectance images, which are captured by a laser range finder as a byproduct of range sensing. However, since this method relies on the similarity between newly captured and stored reflectance images, its performance deteriorates in scenes with high perceptual aliasing, that is, scenes containing few or repetitive features, such as a corridor or a mine. To address this problem, this paper proposes a new global localization technique that combines the two-step strategy with a particle filter. The effectiveness of the proposed technique is demonstrated through experiments in real environments.
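To make the combination described above concrete, the following is a minimal sketch, not the authors' implementation, of a particle filter whose measurement update weights candidate poses by the similarity between the current reflectance image and reference reflectance images stored with the map. All names (`similarity`, `particle_filter_step`, the `reference_images` lookup, the noise parameters) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def similarity(reflectance_img, reference_img):
    """Illustrative similarity score (normalized cross-correlation)."""
    a = reflectance_img - reflectance_img.mean()
    b = reference_img - reference_img.mean()
    denom = np.linalg.norm(a) * np.linalg.norm(b) + 1e-9
    return float(np.dot(a.ravel(), b.ravel()) / denom)

def particle_filter_step(particles, weights, motion, reflectance_img, reference_images):
    """One predict/update/resample cycle over candidate poses.

    particles:        (N, 3) array of planar poses (x, y, yaw), for simplicity
    reference_images: per-particle reference reflectance image near each pose
                      (assumed to be available from the map)
    """
    n = len(particles)
    # Predict: apply odometry-based motion with additive Gaussian noise.
    particles = particles + motion + rng.normal(scale=[0.05, 0.05, 0.01], size=(n, 3))
    # Update: weight each particle by reflectance-image similarity.
    for i in range(n):
        ref = reference_images[i]  # reference image near particle i (assumed lookup)
        weights[i] *= max(similarity(reflectance_img, ref), 1e-6)
    weights = weights / weights.sum()
    # Resample when the effective sample size drops below half the particle count.
    if 1.0 / np.sum(weights ** 2) < n / 2:
        idx = rng.choice(n, size=n, p=weights)
        particles, weights = particles[idx], np.full(n, 1.0 / n)
    return particles, weights
```

In this sketch the reflectance-image similarity plays the role of the measurement likelihood, so ambiguous matches in perceptually aliased scenes spread probability mass over several hypotheses rather than committing to a single wrong pose.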