Abstract
Based on indications from neuroscience and psychology, both perception and action can be internally simulated by activating sensory and motor areas of the brain without external sensory input and without any resulting overt behavior. This hypothesis can be highly useful in real robot applications: a robot, for instance, can compensate for corrupted sensory inputs by replacing them with its internal simulation. The accuracy of such simulation depends strongly on the agent's experiences; the more the agent knows about the environment, the stronger the internal representation it can build of it. Although many works have addressed this hypothesis with various levels of success, none of them has so far used the robot's vision as a sensory input at the sensorimotor abstraction level, where data are extracted from the environment. In this study, vision-based sensorimotor abstraction is implemented through memory-based learning in a real mobile robot, "Hemisson", to investigate the possibility of explaining its inner world through internal simulation of perception and action at the abstract level. Analysis of the experiments illustrates that our robot, with vision as its sensory input, has developed a simple association or anticipation mechanism through interacting with the environment, which enables it, based on its history and the present situation, to guide its behavior in the absence of any external interaction.
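The abstract does not give implementation details, but the kind of memory-based anticipation it describes can be sketched roughly as follows: experiences are stored as (percept, action, next percept) triples, and internal simulation chains nearest-neighbour retrievals so a remembered percept can stand in for a corrupted sensor reading. All names here (SensorimotorMemory, record, predict, simulate) and the nearest-neighbour retrieval rule are illustrative assumptions, not the paper's actual method.

    # Minimal sketch (assumed, not the authors' implementation) of
    # memory-based sensorimotor prediction and internal simulation.
    import math

    class SensorimotorMemory:
        def __init__(self):
            # Each entry: (percept, action, next_percept)
            self.experiences = []

        def record(self, percept, action, next_percept):
            """Store one real interaction with the environment."""
            self.experiences.append((percept, action, next_percept))

        def predict(self, percept, action):
            """Return the remembered outcome of the closest past situation
            in which the same action was taken."""
            best, best_dist = None, math.inf
            for p, a, nxt in self.experiences:
                if a != action:
                    continue
                dist = sum((x - y) ** 2 for x, y in zip(percept, p))
                if dist < best_dist:
                    best, best_dist = nxt, dist
            return best

        def simulate(self, percept, actions):
            """Chain predictions to simulate a sequence of actions
            internally, without any external sensory input."""
            trajectory = []
            for action in actions:
                percept = self.predict(percept, action)
                if percept is None:  # no matching experience; simulation stalls
                    break
                trajectory.append(percept)
            return trajectory

    # Usage: after recording real interactions, a simulated percept can
    # replace a missing or corrupted sensor reading.
    mem = SensorimotorMemory()
    mem.record((0.1, 0.9), "forward", (0.3, 0.7))
    mem.record((0.3, 0.7), "forward", (0.6, 0.4))
    print(mem.simulate((0.1, 0.9), ["forward", "forward"]))

In this toy form the percepts are low-dimensional feature vectors; a vision-based system such as the one the abstract describes would first abstract camera images into such compact features before storing them.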