Abstract

Accurate and reliable localization is a prerequisite for autonomously performing high-level tasks with humanoid robots. In this paper, we present a probabilistic localization method for humanoid robots navigating in arbitrarily complex indoor environments using only onboard sensing, which is a challenging task. Inaccurate motion execution of biped robots leads to an uncertain estimate of odometry, and their limited payload constrains perception to observations from lightweight and typically noisy sensors. Additionally, humanoids do not only walk on flat ground and perform a swaying motion while walking, which requires estimating a full 6D torso pose. We apply Monte Carlo localization to globally determine and track a humanoid's 6D pose in a given 3D world model, which may contain multiple levels and staircases. We present an observation model to integrate range measurements from a laser scanner or a depth camera as well as attitude data and information from the joint encoders. To increase the localization accuracy, e.g., while climbing stairs, we propose a further observation model and additionally use monocular vision data in an improved proposal distribution. We demonstrate the effectiveness of our methods in extensive real-world experiments with a Nao humanoid. As the experiments illustrate, the robot is able to globally localize itself and accurately track its 6D pose while walking and climbing stairs.
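
The sketch below illustrates the general Monte Carlo localization loop referenced in the abstract (predict with noisy odometry, weight particles with an observation likelihood, resample). It is a minimal illustration only, not the authors' implementation: the particle state is a 6D torso pose, and `dummy_likelihood` is a placeholder standing in for the paper's range/attitude/joint-encoder observation model.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, odom_delta, noise_std):
    """Propagate each 6D particle (x, y, z, roll, pitch, yaw) by a noisy odometry increment."""
    noise = rng.normal(0.0, noise_std, size=particles.shape)
    return particles + odom_delta + noise

def weight(particles, observation, likelihood_fn):
    """Evaluate the observation model for every particle and normalize."""
    w = np.array([likelihood_fn(p, observation) for p in particles])
    w += 1e-300                      # guard against all-zero weights
    return w / w.sum()

def resample(particles, weights):
    """Low-variance (systematic) resampling."""
    n = len(particles)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    idx = np.minimum(idx, n - 1)     # guard against floating-point overshoot
    return particles[idx]

if __name__ == "__main__":
    particles = rng.normal(0.0, 0.5, size=(500, 6))          # 500 particles, 6D pose
    odom_delta = np.array([0.05, 0.0, 0.0, 0.0, 0.0, 0.01])  # small forward step + yaw
    noise_std = np.array([0.02, 0.02, 0.01, 0.01, 0.01, 0.02])

    def dummy_likelihood(pose, z_measured):
        # Hypothetical stand-in: Gaussian likelihood on the torso height only.
        return np.exp(-0.5 * ((pose[2] - z_measured) / 0.05) ** 2)

    particles = predict(particles, odom_delta, noise_std)
    w = weight(particles, 0.3, dummy_likelihood)
    particles = resample(particles, w)
    print("pose estimate:", particles.mean(axis=0))
```

In the paper's setting, the likelihood would fuse laser or depth-camera range measurements against the 3D world model together with attitude and joint-encoder data, and the proposal distribution can additionally be improved with monocular vision as described above.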
