Abstract
The sense of touch is a key aspect of the human capability to robustly grasp and manipulate a wide variety of objects. Despite many years of development, there is still no preferred solution for tactile sensing in robotic hands: multiple technologies are available, each with different benefits depending on the application. This study compares the performance of different tactile sensors mounted on the variable stiffness gripper CLASH 2F: three commercial sensors (a single-taxel sensor each from Tacterion and Kinfinity, and the Robotic Finger Sensor v2 from SparkFun), a self-built resistive 3 × 3 sensor array, and two self-built magnetic 3-DoF touch sensors, one with four taxels and one with a single taxel. We measure the minimum force detectable by each sensor, test whether slip detection is possible with the taxels available on each sensor, and use the sensors for edge detection to obtain the orientation of the grasped object. To evaluate the benefits of each technology and to assess which sensor best fits the control loop of a variable stiffness hand, we use the CLASH gripper to grasp fruits and vegetables following a published benchmark for pick-and-place operations. To facilitate the repetition of tests, the CLASH hand is endowed with tactile buttons that ease human–robot interaction, such as executing a predefined program, resetting errors, or commanding the full robot to move in gravity compensation mode.
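The abstract does not detail the edge-detection step, but estimating orientation from a small taxel array can be illustrated with a standard image-moment computation. The Python sketch below computes an edge angle as the principal axis of the pressure distribution on a 3 × 3 array; the grid size matches the self-built resistive array, while the edge_orientation helper and its moment-based method are illustrative assumptions rather than the paper's algorithm.

import numpy as np

def edge_orientation(taxels: np.ndarray) -> float:
    """Estimate the in-plane angle (radians) of an edge pressed into a
    3 x 3 taxel array, from the principal axis of the pressure image.
    Illustrative sketch only, not the method used in the study."""
    ys, xs = np.mgrid[0:3, 0:3]                    # taxel grid coordinates
    w = taxels / taxels.sum()                      # normalized pressure weights
    cx, cy = (w * xs).sum(), (w * ys).sum()        # pressure centroid
    mxx = (w * (xs - cx) ** 2).sum()               # second central moments
    myy = (w * (ys - cy) ** 2).sum()
    mxy = (w * (xs - cx) * (ys - cy)).sum()
    return 0.5 * np.arctan2(2.0 * mxy, mxx - myy)  # equivalent-ellipse axis

# Example: a diagonal edge pressing across the array
p = np.array([[1.0, 0.2, 0.0],
              [0.2, 1.0, 0.2],
              [0.0, 0.2, 1.0]])
print(np.degrees(edge_orientation(p)))  # ~45 degrees relative to the array axes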
Highlights
In real-world scenarios, robotic grasping is still a challenge due to variation in object properties such as shape, weight, and friction, on top of problems arising from the robotic components, for example a vision system that cannot accurately identify a partially occluded object in cluttered scenes, or a hand that cannot robustly hold a grasp when an unexpected collision happens.
To provide a standard baseline for comparison, in this study we evaluate a number of tactile sensors integrated on the same gripper, the CLASH 2F, which is based on the technology of the CLASH 3F hand (Friedl et al., 2018).
To facilitate interaction with the gripper and to ease the teaching of poses required for the tests, we developed a user interface that can be mounted on the CLASH 2F.
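The abstract states that this interface exposes functions such as executing a predefined program, resetting errors, and switching the robot into gravity compensation mode. A minimal Python sketch of such a button-to-command dispatch follows; the Button names and the robot interface are illustrative assumptions, not the actual CLASH software API.

from enum import Enum, auto

class Button(Enum):
    # Hypothetical button set; the paper lists these functions but
    # not the exact button layout.
    RUN_PROGRAM = auto()
    RESET_ERRORS = auto()
    GRAVITY_COMP = auto()

def handle_button(button: Button, robot) -> None:
    """Dispatch a tactile-button press to a robot command.
    `robot` is a hypothetical control interface."""
    if button is Button.RUN_PROGRAM:
        robot.execute_predefined_program()
    elif button is Button.RESET_ERRORS:
        robot.reset_errors()
    elif button is Button.GRAVITY_COMP:
        robot.enable_gravity_compensation()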
Summary
In real-world scenarios, robotic grasping is still a challenge due to variation in object properties such as shape, weight, and friction, on top of problems arising from the robotic components, for example a vision system that cannot accurately identify a partially occluded object in cluttered scenes, or a hand that cannot robustly hold a grasp when an unexpected collision happens. To facilitate interaction with the gripper and to ease the teaching of poses required for the tests, we developed a user interface that can be mounted on the CLASH 2F (or on the CLASH 3F). This user interface helps to teach grasp poses of the hand and to access basic control functions of the robot (e.g., gravity compensation). It delivers status information on the robot and the hand through visual feedback to the user, which greatly facilitates resolving unexpected errors during operation. The additional information from the implemented sensors allows us to successfully grasp all objects in the second tested scenario, a crate filled with one full layer of punnets. We can detect slippage and objects that rotate inside the hand, and adapt the grasp or the arm motion accordingly.
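Slip detection from taxel signals is typically based on high-frequency fluctuations of the contact force. The following Python sketch (the window length and threshold are illustrative values, not from the paper) removes a moving-average trend from a short window of force samples and flags slip when the residual vibration power exceeds a threshold.

import numpy as np

def detect_slip(window: np.ndarray, threshold: float = 1e-3) -> bool:
    """Flag incipient slip from a short window of taxel force samples.
    Minimal sketch under generic assumptions, not the paper's method."""
    kernel = np.ones(5) / 5.0                          # 5-sample moving average
    trend = np.convolve(window, kernel, mode="same")   # low-frequency trend
    residual = window - trend                          # high-frequency content
    return float(np.mean(residual ** 2)) > threshold   # vibration power test

In a grasp controller, a positive result would trigger the adaptations described above, e.g., tightening the grasp or slowing the arm motion.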