Abstract

Humanoid robots have become appealing to the research community because of their potential versatility. However, traditional programming approaches may not reveal their full capabilities. Thus, an important goal is to develop a humanoid robot that can learn to perform complex tasks by itself. This paper proposes a method to recognize and regenerate motion in a humanoid robot. We demonstrate how a sequence of high-dimensional motion data can be automatically segmented into abstract action classes. The sequence from a 25-d.o.f. humanoid robot performing a ball tracking task is reduced to its intrinsic dimensionality by non-linear principal component analysis (NLPCA). The motion data are then segmented automatically by incrementally generating NLPCA networks with a circular constraint and assigning data points to these networks according to their temporal order, in a conquer-and-divide fashion. Repeated motion patterns are removed based on their proximity to similar motion patterns in the reduced sensorimotor space, yielding a nonredundant set of abstract actions. The networks abstracted five motion patterns without any prior information about the number or type of motion patterns. We ensured faithful motion reproduction by employing a motion optimization algorithm based on learning the sensorimotor mapping in the low-dimensional space.
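The core idea of NLPCA is an autoassociative (bottleneck) neural network that compresses high-dimensional data to a low-dimensional code and reconstructs it. The sketch below is purely illustrative and is not the paper's actual network: the layer sizes, toy "motion" data, and plain gradient-descent training are all assumptions made for the example.

```python
import numpy as np

# Illustrative NLPCA-style autoencoder: 25-D joint data -> 1-D code -> 25-D.
# All hyperparameters here are assumptions for this sketch.
rng = np.random.default_rng(0)

# Toy "motion" data: 200 frames of 25 joint angles lying near a 1-D curve.
t = np.linspace(0, 2 * np.pi, 200)
X = np.stack([np.sin(t + 0.1 * j) for j in range(25)], axis=1)
X += 0.01 * rng.standard_normal(X.shape)

d_in, d_hid, d_code = 25, 10, 1
W1 = 0.1 * rng.standard_normal((d_in, d_hid))    # encoder hidden layer
W2 = 0.1 * rng.standard_normal((d_hid, d_code))  # bottleneck (intrinsic dim.)
W3 = 0.1 * rng.standard_normal((d_code, d_hid))  # decoder hidden layer
W4 = 0.1 * rng.standard_normal((d_hid, d_in))    # reconstruction layer

lr = 0.01
for _ in range(500):
    # Forward pass through the bottleneck.
    H1 = np.tanh(X @ W1)
    Z = H1 @ W2                      # low-dimensional code
    H2 = np.tanh(Z @ W3)
    Xh = H2 @ W4                     # reconstruction
    E = Xh - X
    # Backpropagate squared reconstruction error.
    gW4 = H2.T @ E
    dH2 = (E @ W4.T) * (1 - H2 ** 2)
    gW3 = Z.T @ dH2
    dZ = dH2 @ W3.T
    gW2 = H1.T @ dZ
    dH1 = (dZ @ W2.T) * (1 - H1 ** 2)
    gW1 = X.T @ dH1
    for W, g in ((W1, gW1), (W2, gW2), (W3, gW3), (W4, gW4)):
        W -= lr * g / len(X)

# Each 25-D frame is mapped to a 1-D intrinsic coordinate.
code = np.tanh(X @ W1) @ W2
print(code.shape)
```

In the paper's setting the bottleneck additionally carries a circular constraint, so the code traces a closed curve suited to periodic motion; the sketch omits that refinement for brevity.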
