Abstract

A conventional blind walking algorithm has low walking stability on uneven terrain because the robot cannot respond quickly to height changes of the ground with only the limited information from foot force sensors. To cope with rough terrain, it is essential to obtain 3D ground information. This paper therefore proposes a vision-guided six-legged walking algorithm for stable walking on uneven terrain. We obtain noise-filtered 3D ground information using a Kinect sensor and experimentally derive the coordinate transformation between the Kinect sensor and the robot body. While generating the landing positions of the six feet from predefined walking parameters, the proposed algorithm modifies each landing position according to the reliability and safety of the landing area, using the obtained 3D ground information. For continuous walking, we also propose a ground merging algorithm, and we validate the performance of the proposed algorithms through walking experiments on a treadmill with obstacles.
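
To make the pipeline described in the abstract concrete, a minimal Python sketch is given below. It is an illustration rather than the authors' code: the extrinsic transform values, the height-map representation, and the max_step safety margin are all assumptions introduced here.

```python
# Minimal sketch (not the authors' implementation) of the two steps named in the
# abstract: expressing Kinect measurements in the robot body frame with an
# experimentally derived extrinsic transform, and modifying a nominal foot
# landing position using the measured ground.
import numpy as np

# Hypothetical extrinsic calibration: p_body = R @ p_kinect + t.
R = np.eye(3)                      # placeholder rotation (body <- Kinect)
t = np.array([0.20, 0.0, 0.45])    # placeholder translation in meters

def kinect_to_body(points_kinect: np.ndarray) -> np.ndarray:
    """Transform an (N, 3) Kinect point cloud into the robot body frame."""
    return points_kinect @ R.T + t

def ground_height(grid: dict, x: float, y: float, cell: float = 0.02):
    """Look up the ground height at (x, y) in a simple 2D height map.

    `grid` maps integer cell indices to (mean_height, n_samples);
    returns None when the cell was never observed (unreliable area).
    """
    entry = grid.get((round(x / cell), round(y / cell)))
    return None if entry is None else entry[0]

def adjust_landing(nominal_xyz, grid, max_step=0.05):
    """Snap the nominal landing height to the measured ground height when the
    cell has been observed and the height change stays within a safety margin;
    otherwise keep the nominal position and flag it as unmodified."""
    x, y, z_nominal = nominal_xyz
    z_measured = ground_height(grid, x, y)
    if z_measured is None or abs(z_measured - z_nominal) > max_step:
        return (x, y, z_nominal), False
    return (x, y, z_measured), True
```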

Highlights

  • The 2011 nuclear power plant explosion in Fukushima, Japan, was the result of a powerful earthquake and massive tsunami

  • The Defense Advanced Research Projects Agency (DARPA) launched a robotics challenge in 2012 that required robots to perform tasks designed for disaster scenarios, and Team ViGIR demonstrated those tasks in computer simulation in 2013 [1]

  • Feng S. et al. generated walking paths that account for the center of mass (CoM) and performed simulations and experiments of biped walking and ladder climbing with the humanoid robot ATLAS (Boston Dynamics Co., Boston, MA, USA) [2]

Summary

Introduction

The 2011 nuclear power plant explosion in Fukushima, Japan, was the result of a powerful earthquake and massive tsunami, and disasters of this kind have motivated research on robots that can traverse unstructured terrain. Belter D. et al. determined the six-DOF pose of a six-legged robot with a monocular vision system using the Parallel Tracking and Mapping (PTAM) algorithm and an Inertial Measurement Unit (IMU). They used this self-localization system together with an RRT-based motion planner, which allows a robot to walk autonomously on unknown rough terrain [12]. Ramos O.E. et al. proposed a method for stable biped walking on rough terrain using inverse dynamics control and stereo vision information [14]. They successfully implemented a stable 3D dynamic walking simulation using the ground shape information obtained from a depth camera attached to the robot head. The algorithm proposed in this paper adjusts the foot landing positions by considering the reliability and safety of each landing area, based on 3D ground shape data obtained from the Kinect sensor and merged across frames.
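
As a rough illustration of how merged ground data could support such landing-position adjustment, the following sketch fuses successive point clouds into an accumulated height map. This is my own illustration under stated assumptions, not the paper's implementation; the per-cell running average and sample count merely stand in for the paper's reliability measure.

```python
# Hedged sketch of a ground-merging step for continuous walking: each new
# Kinect frame, already expressed in the robot body frame, is fused into an
# accumulated 2D height map so that terrain under upcoming footholds remains
# available between frames.
import numpy as np

def merge_frame(grid: dict, points_body: np.ndarray, cell: float = 0.02) -> dict:
    """Fuse an (N, 3) point cloud (robot body frame) into a height-map grid.

    `grid` maps (ix, iy) cell indices to (mean_height, n_samples)."""
    for x, y, z in points_body:
        key = (round(float(x) / cell), round(float(y) / cell))
        mean_z, n = grid.get(key, (0.0, 0))
        # Incremental mean: older observations are retained, new ones absorbed.
        grid[key] = ((mean_z * n + z) / (n + 1), n + 1)
    return grid

# Typical use: start with grid = {}, then for every captured frame call
#   grid = merge_frame(grid, points_in_body_frame)
# Cells with larger sample counts can be treated as more reliable footholds.
```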

Robot Platform: 48 V Maxon BLDC motors with harmonic gears
Image Acquisition
Error Evaluation
Image Post-Processing
Landing Position Modification Algorithm
Experiment
Findings
Conclusions