Abstract
In this paper we describe a control framework that integrates tactile and force sensing to regulate the physical interaction of an anthropomorphic robotic arm with the external environment. In particular, we exploit tactile sensors distributed on the robot fingers and a 6-axis force/torque sensor placed at the base of the arm, just below the shoulder. Owing to their different mounting locations and sensitivities, the sensors provide different types of contact information; their integration allows the system to handle both slight and hard contacts by switching among control strategies according to the location and intensity of the contact. We provide real-world experimental results showing how a humanoid torso equipped with a moving head, eyes, arm, and hand can perform visually guided reaching while coping with different types of unexpected contact with the environment.
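The switching policy sketched in the abstract (tactile feedback for slight contacts at the fingers, force/torque feedback for harder contacts sensed near the shoulder) can be illustrated with a minimal sketch. Everything below is an assumption for illustration only: the class names, thresholds, and strategy labels are hypothetical and do not reflect the paper's actual controllers or parameter values.

```python
from dataclasses import dataclass
from enum import Enum, auto

class Strategy(Enum):
    TACTILE_FINGER_CONTROL = auto()  # slight contact sensed by fingertip tactile arrays
    FORCE_COMPLIANCE = auto()        # hard contact sensed by the 6-axis F/T sensor
    FREE_MOTION = auto()             # no contact: continue visually guided reaching

@dataclass
class ContactState:
    tactile_pressure: float  # max fingertip tactile reading (normalized, 0..1)
    arm_wrench_norm: float   # norm of the 6-axis force/torque reading (N)

# Illustrative placeholder thresholds, not values from the paper.
TACTILE_THRESHOLD = 0.05
WRENCH_THRESHOLD = 2.0

def select_strategy(contact: ContactState) -> Strategy:
    """Pick a control strategy from contact location and intensity."""
    if contact.arm_wrench_norm > WRENCH_THRESHOLD:
        # Hard contact anywhere along the arm: comply using force feedback.
        return Strategy.FORCE_COMPLIANCE
    if contact.tactile_pressure > TACTILE_THRESHOLD:
        # Slight contact at the fingers: regulate interaction via tactile feedback.
        return Strategy.TACTILE_FINGER_CONTROL
    return Strategy.FREE_MOTION
```

For example, `select_strategy(ContactState(tactile_pressure=0.1, arm_wrench_norm=0.5))` would select tactile finger control, since only the fingertip sensors register contact.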