Abstract

This paper presents a method for computing humanoid robot joint angles from human motion data. The proposed method groups the motors of an upper-body humanoid robot into pan-tilt and spherical modules, solves the inverse kinematics (IK) problem for each module, and finally resolves the coordinate transformations among the modules to yield the full solution. The obtained joint angles and velocities are scaled so that their limits are satisfied and their smoothness is preserved. To address the robustness-accuracy tradeoff in handling kinematic singularities, we design an adaptive regularization parameter that is active only when the robot operates near a singular configuration. This adaptive algorithm is provably robust and is simple and fast to compute. Simulation on a seven-degree-of-freedom (DOF) robot arm shows that tracking accuracy is slightly reduced in a neighborhood of a singularity to gain robustness, while high accuracy is recovered outside this neighborhood. Experiments on a 17-DOF upper-body humanoid robot confirm that user-demonstrated gestures are closely imitated by the robot. The proposed method outperforms state-of-the-art Jacobian-based IK in terms of overall imitation accuracy while guaranteeing robust and smoothly scaled trajectories, making it well suited for applications such as humanoid robot teleoperation and programming by demonstration.
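
The abstract's adaptive regularization resembles singularity-adaptive damped least squares, a standard technique for Jacobian-based IK. The sketch below illustrates that general idea only, not the paper's exact formulation: the damping schedule and the parameters `sigma_threshold` and `lambda_max` are assumed values introduced for illustration.

```python
# Illustrative sketch: one Jacobian-based IK step with a damping term that is
# zero away from singularities and grows as the arm approaches one. This is a
# generic damped least-squares scheme, not the paper's specific algorithm.
import numpy as np

def adaptive_ik_step(jacobian, task_error, sigma_threshold=0.05, lambda_max=0.1):
    """Return a joint-space update dq for a desired task-space displacement.

    jacobian        : m x n task Jacobian at the current configuration.
    task_error      : m-vector of desired task-space displacement.
    sigma_threshold : smallest-singular-value level below which damping
                      activates (assumed value).
    lambda_max      : maximum squared damping applied at an exact singularity
                      (assumed value).
    """
    # The smallest singular value measures proximity to a kinematic singularity.
    sigma_min = np.linalg.svd(jacobian, compute_uv=False).min()

    # Damping is inactive in well-conditioned configurations (exact
    # least-squares solution) and ramps up smoothly near a singularity.
    if sigma_min >= sigma_threshold:
        lam2 = 0.0
    else:
        lam2 = lambda_max * (1.0 - (sigma_min / sigma_threshold) ** 2)

    # Damped least-squares update: dq = J^T (J J^T + lam2 * I)^{-1} e
    m = jacobian.shape[0]
    JJt = jacobian @ jacobian.T + lam2 * np.eye(m)
    return jacobian.T @ np.linalg.solve(JJt, task_error)
```

Trading a small amount of tracking accuracy for a bounded joint-velocity update near singularities is the same robustness-accuracy tradeoff the abstract describes; away from singularities the damping vanishes and full accuracy is retained.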
