Abstract

Humans acquire body awareness through a process of sensorimotor development that starts in early infancy [1]. Such awareness is supported by a neural representation of the body, a body schema, that can be used to infer the limbs' position in space and to guide motor behaviors. Considering more specifically the visual control of reaching, a form of visual-proprioceptive calibration of the body might be performed by infants during the first months of life, as they spend a lot of time observing themselves while moving [2]. Moreover, the theory of human motor learning and control postulates that forward and inverse internal models of the limbs are learned and kept up to date in the cerebellum [3]. For example, a forward model's prediction can be combined with the actual sensory feedback through Bayesian integration to improve the estimation of the current state of the system [4]. Our objective is to improve the accuracy of an analytical model of the robot using visual information and Bayesian estimation techniques. In particular, we consider a visual-based reaching scenario with the iCub humanoid robot. Instead of learning an internal model from scratch, we exploit the iKin kinematic model of the robot [5], provided within the YARP/iCub software framework, and adapt it online during reaching movements to cope with modeling inaccuracies, allowing the robot to precisely reach a desired position and orientation. In future work we want to use this improved internal model for visual-based hand control in grasping tasks, since precise grasping requires a good awareness of the hand pose, and it is typically very difficult to obtain an accurate analytical model of the system due to hard-to-model aspects (e.g., elasticity) and changes that might occur over time (e.g., misalignment of a joint rotation axis).
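The Bayesian integration mentioned above can be illustrated with a minimal sketch. Assuming both the forward-model prediction and the sensory measurement are Gaussian, the optimal fused estimate is a precision-weighted average, as in a one-dimensional Kalman update; the values below (a hand position in meters and its variances) are purely illustrative and not taken from the paper:

```python
def bayes_fuse(pred_mean, pred_var, meas_mean, meas_var):
    """Fuse a forward-model prediction with a sensory measurement.

    Both are modeled as Gaussians; the result is the posterior
    mean and variance (precision-weighted average).
    """
    # Gain: trust the measurement more when the prediction is uncertain.
    k = pred_var / (pred_var + meas_var)
    mean = pred_mean + k * (meas_mean - pred_mean)
    var = (1.0 - k) * pred_var  # fused variance is smaller than either input
    return mean, var

# Illustrative example: the forward model predicts the hand at x = 0.30 m
# (variance 0.01), vision measures x = 0.34 m (variance 0.01).
# With equal variances the fused mean is the midpoint, 0.32 m,
# and the fused variance halves to 0.005.
m, v = bayes_fuse(0.30, 0.01, 0.34, 0.01)
```

Note that the fused variance is always lower than either input variance, which is why combining the internal model with visual feedback improves the state estimate.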
