Abstract
In this paper we investigate the challenges of localizing walking humanoid robots using Visual SLAM (VSLAM). We propose a novel dense RGB-D SLAM framework that integrates seamlessly with the dynamic state of a humanoid to provide real-time localization and dense mapping of its surroundings. Following recent research in humanoid localization, we explore the coupling between a VSLAM system and the humanoid's state by considering the gait cycle and foot contacts. By capturing the unilateral ground forces at the robot's feet, we analyze how these effects undermine the quality of data acquisition and association for VSLAM, and we design a system that mitigates their impact. We evaluate our framework on both open- and closed-loop bipedal gaits using a low-cost humanoid platform, and demonstrate that it outperforms kinematic odometry and state-of-the-art dense RGB-D VSLAM methods by continuously localizing the robot, even in the face of highly irregular and unstable motions.