Abstract
In this paper, we present a new approach to realizing whole-body tactile interaction with a self-organizing, multi-modal artificial skin on a humanoid robot. We therefore equipped the whole upper body of the humanoid HRP-2 with patches of CellulARSkin, a modular artificial skin. In order to automatically handle a potentially large number of tactile sensor cells and motor units, the robot uses open-loop exploration motions and the accelerometers distributed in the artificial skin cells to acquire its self-centered sensory-motor knowledge. This body self-knowledge is then utilized to transform multi-modal tactile stimulation into reactive body motions. Tactile events provide feedback on changes of contact over the whole-body surface. We demonstrate the feasibility of our approach on a humanoid, here HRP-2, grasping large and unknown objects via tactile feedback alone. Kinesthetically taught grasping trajectories are reactively adapted to the size and stiffness of different test objects. Our paper contributes the first realization of a self-organizing tactile sensor-behavior mapping on a full-sized humanoid robot, enabling a position-controlled robot to handle objects compliantly.
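To make the reactive adaptation concrete, the following is a minimal, hypothetical sketch (not code from the paper) of the core idea: a kinesthetically taught joint trajectory is replayed waypoint by waypoint, and each waypoint is offset proportionally to the deviation of the measured tactile contact pressure from a desired value, so that a position-controlled arm yields compliantly to objects of different size and stiffness. The function name, gain, and pressure values are illustrative assumptions.

```python
# Hypothetical sketch: tactile-feedback adaptation of a taught grasp
# trajectory. All names and parameters are illustrative, not the
# paper's actual implementation.

def adapt_waypoint(taught_angle, measured_pressure,
                   desired_pressure=0.5, gain=0.1):
    """Return a corrected joint angle (rad) for one trajectory waypoint.

    If the measured contact pressure exceeds the desired value (the
    object is larger or stiffer than during teaching), the correction
    backs the joint off; if the pressure is too low, the grasp closes
    further.
    """
    error = desired_pressure - measured_pressure
    return taught_angle + gain * error

# Replaying a taught trajectory against simulated tactile readings:
taught = [0.0, 0.2, 0.4, 0.6]        # taught joint angles (rad)
pressures = [0.0, 0.3, 0.7, 0.9]     # simulated normalized cell pressures
adapted = [adapt_waypoint(a, p) for a, p in zip(taught, pressures)]
```

In this sketch the first two waypoints close slightly further (pressure below target), while the last two back off (pressure above target), which is the qualitative behavior needed to grasp both soft and stiff objects with the same taught trajectory.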