Abstract
We propose an approach to controlling contact between a robot and its environment during physical interaction. Current physical interaction control approaches are limited in the range of tasks they can perform. To allow robots to perform more tasks, we derive tactile features representing deformations of the mechanically compliant sensing surface of a tactile sensor and incorporate these features into a robot controller, analogous to visual servoing, via touch- and task-dependent tactile feature mapping matrices. As a first contribution, we derive tactile features that localize a contact coordinate frame between an object and an array of pressure-sensing elements with a mechanically compliant surface, attached to the end-effector of a robot arm interacting with the object. As a second contribution, we propose tactile projection matrices to design a tactile servoing controller that combines these tactile features with a Cartesian impedance controller of the robot arm. These matrices map the proposed tactile features so that the controller balances not only normal forces but also torques about the sensor's axes. This allows the end-effector to steer the contact frame in a desired manner by regulating errors in the tactile features, addressing two common robotics tasks: exploration and co-manipulation.
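To make the control flow described above concrete, the following Python sketch illustrates one way the pipeline could be structured: tactile features (contact centroid, normal force, and torques about the sensor's axes) are estimated from a pressure array, and their error is mapped through a task-dependent projection matrix into a Cartesian motion reference for an impedance controller. The feature set, the projection matrix `P`, and the gain are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

# Hedged sketch of a tactile-servoing loop in the spirit of the abstract.
# All quantities (feature vector layout, P, gains) are assumptions for
# illustration only.

def tactile_features(pressure, cell_positions):
    """Estimate contact-frame features from a pressure-sensing array:
    contact centroid (x, y), total normal force, and torques about the
    sensor's x/y axes (pressure-weighted moments)."""
    p = pressure.ravel()
    total = p.sum()
    if total < 1e-9:                      # no contact detected
        return np.zeros(5)
    centroid = (cell_positions * p[:, None]).sum(axis=0) / total
    r = cell_positions - centroid         # lever arms of each sensing cell
    tau_x = (r[:, 1] * p).sum()           # torque about sensor x-axis
    tau_y = -(r[:, 0] * p).sum()          # torque about sensor y-axis
    return np.array([centroid[0], centroid[1], total, tau_x, tau_y])

def tactile_servo_step(f_des, f_meas, P, gain=0.5):
    """Map the tactile-feature error through a touch- and task-dependent
    projection matrix P (6 x 5 here) into a Cartesian end-effector twist
    that a Cartesian impedance controller can track."""
    e = f_des - f_meas                    # error in the tactile features
    return gain * (P @ e)                 # [vx, vy, vz, wx, wy, wz]
```

In such a scheme, different choices of `P` would select which feature errors drive which Cartesian directions (e.g., regulating normal force along the sensor normal while sliding tangentially during exploration), while the impedance controller keeps the interaction compliant.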