Abstract

Humanoid robots are expected to be integrated into daily life. This requires them to perform human-like actions that humans can easily understand. Learning by imitation is an effective framework that enables robots to generate the same motions that humans do. However, it is generally not useful for a robot to reproduce a learned motion exactly, because its current environment is likely to differ from the environment in which the motion was learned. A humanoid robot should therefore synthesize motions adapted to the current environment by modifying learned motions. Previous research encoded captured human whole-body motions into hidden Markov models, hereafter referred to as motion primitives, and generated human-like motions from the acquired motion primitives. The contact between the body and the environment also needs to be controlled so that the humanoid robot's whole-body motion can be realized in its current environment. This paper proposes a novel approach that synthesizes kinematic data from the motion primitives and controls the torques of all joints of the humanoid robot to achieve the desired whole-body motions and contact forces. Experiments demonstrate the validity of the proposed approach to synthesizing and controlling whole-body motions of humanoid robots.
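The core idea of a motion primitive, as described above, is to encode demonstrated whole-body trajectories in a hidden Markov model and then generate new kinematic data from the learned model. The following is a minimal sketch of that idea, not the authors' implementation: it assumes the hmmlearn library, Gaussian emissions over joint angles, and synthetic stand-in data in place of real motion capture; the dimensions and hyperparameters are illustrative only.

```python
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(0)

# Hypothetical training data: five demonstrations of a whole-body motion,
# each a (frames x joint angles) array. Smooth noisy sinusoids stand in for
# captured human motion here.
t = np.linspace(0.0, 2.0 * np.pi, 100)
demos = [
    np.column_stack(
        [np.sin(t * (j % 3 + 1)) + 0.05 * rng.standard_normal(t.size) for j in range(20)]
    )
    for _ in range(5)
]

X = np.vstack(demos)                   # concatenated observation sequences
lengths = [d.shape[0] for d in demos]  # length of each demonstration

# "Motion primitive": an HMM with Gaussian emissions over the joint angles.
primitive = GaussianHMM(n_components=8, covariance_type="diag",
                        n_iter=50, random_state=0)
primitive.fit(X, lengths)

# Synthesize a new kinematic trajectory from the learned primitive.
generated_motion, hidden_states = primitive.sample(100, random_state=1)
print(generated_motion.shape)  # (100, 20): frames x joint angles
```

In the paper's setting, the generated kinematic trajectory would then be tracked by a whole-body torque controller that also regulates the contact forces between the robot and its environment; that control layer is outside the scope of this sketch.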
