Abstract

Progress in humanoid robotics, and the need for simpler ways to program such robots, has prompted research into computational models for robot learning from human demonstration. To investigate biologically inspired, human-like robotic movement and imitation, we have constructed a framework based on three key features of human movement and planning: optimality, modularity, and learning. In this paper we describe a computational motor system, based on the minimum variance model of human movement, that uses optimality principles to produce human-like movement in a robot arm. Within this motor system, different movements are represented in a modular structure. When the system observes a demonstrated movement, the motor system uses these modules to produce motor commands, which in turn update an internal state representation. This allows the system either to recognize known movements and move the robot arm accordingly, or to extract key features from the demonstrated movement and use them to learn a new module. The active involvement of the motor system in the recognition and learning of observed movements has its theoretical basis in the direct matching hypothesis, and the use of a model of human-like movement allows the system to learn from human demonstration.
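The recognize-or-learn loop described above can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the class and method names (`MotorModule`, `MotorSystem`, `prediction_error`, the error threshold) are assumptions introduced here, and a simple trajectory-matching error stands in for the paper's minimum-variance motor modules and internal state representation.

```python
import numpy as np


class MotorModule:
    """One stored movement primitive, represented here as a reference
    trajectory. (A simplified stand-in for the paper's optimal-control
    modules derived from the minimum variance model.)"""

    def __init__(self, trajectory):
        self.trajectory = np.asarray(trajectory, dtype=float)

    def prediction_error(self, observed):
        """Mean squared error between this module's predicted trajectory
        and the observed demonstration (compared over the shorter length)."""
        observed = np.asarray(observed, dtype=float)
        n = min(len(self.trajectory), len(observed))
        return float(np.mean((self.trajectory[:n] - observed[:n]) ** 2))


class MotorSystem:
    """Either recognize a demonstrated movement using existing modules,
    or extract it as a new module when no stored module matches well."""

    def __init__(self, threshold=0.05):
        self.modules = []          # learned movement modules
        self.threshold = threshold  # hypothetical recognition threshold

    def observe(self, demonstration):
        errors = [m.prediction_error(demonstration) for m in self.modules]
        if errors and min(errors) < self.threshold:
            # A known movement: the best-matching module would drive the arm.
            return ("recognized", int(np.argmin(errors)))
        # No adequate match: learn the demonstration as a new module.
        self.modules.append(MotorModule(demonstration))
        return ("learned", len(self.modules) - 1)
```

For example, the first demonstration of a reaching trajectory is learned as a new module, and a repeat of the same demonstration is then recognized and mapped back to that module.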
