Abstract

Facing and pointing toward moving targets is a common and natural behavior in daily life. Social robots should be able to display such coordinated behaviors in order to interact naturally with people; for instance, a robot should be able to point at and look toward specific objects. To this end, this paper proposes a scheme for generating coordinated head-arm motion for a humanoid robot with two degrees of freedom for the head and seven for each arm. Specifically, a virtual-plane approach is employed to derive an analytical solution for the head motion, while a quadratic-program (QP)-based method is used to formulate the coordinated dual-arm motion. To obtain the optimal solution, a simplified recurrent neural network is employed to solve the QP problem. The effectiveness of the proposed scheme is demonstrated through both computer simulations and physical experiments.
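
To make the QP-based part of the abstract concrete, the sketch below shows one common way such a velocity-level redundancy-resolution QP can be posed and solved with a projection-type recurrent neural dynamics. It is a minimal illustration, not the paper's actual formulation: the minimum-velocity-norm objective, the planar 3-DOF arm Jacobian, the joint-velocity limits, and all gains are assumptions introduced here for demonstration only.

```python
# Minimal sketch (assumptions throughout): a velocity-level QP of the form
#   minimize   (1/2) * dq^T dq
#   subject to J(q) dq = dr_d           (end-effector velocity task)
#              dq_min <= dq <= dq_max   (joint-velocity limits)
# solved by an LVI-based primal-dual recurrent neural dynamics. This is an
# illustrative stand-in, not the scheme proposed in the paper.

import numpy as np

def planar_jacobian(q, link_lengths):
    """Position Jacobian of an illustrative planar serial arm."""
    n = len(q)
    J = np.zeros((2, n))
    cum = np.cumsum(q)  # cumulative joint angles
    for i in range(n):
        # Column i: derivative of (x, y) with respect to joint i.
        J[0, i] = -np.sum(link_lengths[i:] * np.sin(cum[i:]))
        J[1, i] = np.sum(link_lengths[i:] * np.cos(cum[i:]))
    return J

def rnn_qp_step(u, W, c, J, d, lo, hi, gamma, dt):
    """One Euler step of a primal-dual neural dynamics for
       min 1/2 x^T W x + c^T x  s.t.  J x = d,  lo <= x <= hi."""
    n, m = W.shape[0], J.shape[0]
    M = np.block([[W, -J.T], [J, np.zeros((m, m))]])
    qv = np.concatenate([c, -d])
    # Project the primal part onto the box; the dual part is unconstrained.
    z = u - (M @ u + qv)
    z[:n] = np.clip(z[:n], lo, hi)
    du = gamma * (np.eye(n + m) + M.T) @ (z - u)
    return u + dt * du

if __name__ == "__main__":
    links = np.array([0.3, 0.25, 0.15])           # assumed link lengths [m]
    q = np.array([0.4, 0.6, -0.3])                # assumed joint configuration [rad]
    J = planar_jacobian(q, links)
    dr_d = np.array([0.05, -0.02])                # desired end-effector velocity [m/s]
    n = len(q)
    W, c = np.eye(n), np.zeros(n)                 # minimum-velocity-norm objective
    lo, hi = -0.5 * np.ones(n), 0.5 * np.ones(n)  # joint-velocity limits [rad/s]

    u = np.zeros(n + J.shape[0])                  # primal-dual state [dq; lambda]
    for _ in range(20000):
        u = rnn_qp_step(u, W, c, J, dr_d, lo, hi, gamma=50.0, dt=1e-4)

    dq = u[:n]
    print("joint velocities:", dq)
    print("task residual   :", J @ dq - dr_d)
```

When the velocity limits are inactive, the converged solution coincides with the pseudoinverse solution J⁺ dr_d; the neural dynamics becomes useful precisely when the bound constraints are active, which is the situation the QP formulation is meant to handle.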
