Abstract

In this paper we present a neurally plausible model of human infant reaching based on embodied artificial intelligence, which emphasizes the importance of the sensorimotor interaction between an agent and the world. The model encompasses both the learning of sensorimotor correlations through motor babbling and arm motion planning using spreading activation. It is organized in three layers of neural maps with a parallel structure representing the same sensorimotor space. The motor babbling period shapes the structure of the three neural maps as well as the connections within and between them. We describe an implementation of this model and an investigation of this implementation using a simple reaching task on a humanoid robot. The robot successfully learned to plan reaching motions from a test set with high accuracy and smoothness.
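To make the planning mechanism mentioned above more concrete, the sketch below illustrates the general idea of planning by spreading activation over a neural map: activation is propagated outward from a target node across a small 2D grid, and a reach is then planned by following the activation gradient uphill from the start node to the target. This is only a minimal illustration of the technique; the grid size, decay factor, 4-connected neighborhood, and iteration count are assumptions for the example, not values or details from the paper.

    import numpy as np

    # Illustrative sketch only: a small 2D grid standing in for a neural map
    # of discretized arm postures; all parameters are assumed, not the authors'.
    GRID = (20, 20)   # hypothetical map size
    DECAY = 0.9       # activation attenuation per connection (assumed)
    STEPS = 60        # propagation iterations (assumed)

    def shift(a, dr, dc):
        """Shift a 2D array by (dr, dc) without wrap-around, padding with zeros."""
        out = np.zeros_like(a)
        h, w = a.shape
        out[max(dr, 0):h + min(dr, 0), max(dc, 0):w + min(dc, 0)] = \
            a[max(-dr, 0):h + min(-dr, 0), max(-dc, 0):w + min(-dc, 0)]
        return out

    def spread_activation(target):
        """Propagate decaying activation outward from the target node."""
        act = np.zeros(GRID)
        act[target] = 1.0
        for _ in range(STEPS):
            neighbors = [shift(act, 1, 0), shift(act, -1, 0),
                         shift(act, 0, 1), shift(act, 0, -1)]
            # Each node keeps the larger of its own activation and the decayed
            # activation arriving from its 4-connected neighbors.
            act = np.maximum(act, DECAY * np.maximum.reduce(neighbors))
            act[target] = 1.0
        return act

    def plan_reach(start, target):
        """Follow the activation gradient uphill from the start node to the target."""
        act = spread_activation(target)
        path, node = [start], start
        while node != target:
            r, c = node
            candidates = [(r + dr, c + dc)
                          for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1))
                          if 0 <= r + dr < GRID[0] and 0 <= c + dc < GRID[1]]
            node = max(candidates, key=lambda n: act[n])
            path.append(node)
        return path

    if __name__ == "__main__":
        # Example: plan a sequence of map nodes from posture (0, 0) to target (15, 12).
        print(plan_reach((0, 0), (15, 12)))

In this toy setting, gradient ascent over the spread activation yields a shortest path across the map; the paper's learned maps, their inter-map connections, and the smoothness of the resulting motions go beyond this sketch.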
