Abstract

In this paper, we propose a set of novel tactile descriptors that enable robotic systems to extract robust tactile information during tactile object exploration, regardless of the number of tactile sensors, the sensing technology, the type of exploratory movement, and the duration of surface exploration. The performance and robustness of the tactile descriptors are verified on four different sensing technologies (dynamic pressure sensors, accelerometers, capacitive sensors, and impedance electrode arrays) with two robotic platforms (an anthropomorphic hand and a humanoid) and a large set of objects and materials. Using the proposed tactile descriptors, the Shadow Hand, equipped with multimodal robotic skin on its fingertips, classified 120 materials (100% accuracy) and 30 in-hand objects (98% accuracy) with regular and irregular surface textures by executing human-like active exploratory movements on their surfaces. The robustness of the proposed descriptors was further assessed in a large-object discrimination task with a humanoid. Using the large sensing area on its upper body, the humanoid classified 120 large objects of varying weight and texture while the objects slid between its sensitive hands, arms, and chest. The achieved recognition rate of 90% shows that the proposed tactile descriptors extract robust tactile information from a large number of tactile signals, enabling large objects to be identified by their surface texture regardless of their weight.
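The abstract does not specify how the descriptors are computed, so the sketch below is only an illustration of the general idea it describes: tactile features whose length is independent of the number of sensor channels and of the exploration duration, fed to an off-the-shelf classifier. The specific feature choices (temporal statistics and band-averaged spectra), the channel averaging, and the use of a random forest are assumptions for illustration, not the paper's actual descriptors.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def tactile_descriptor(signal, n_bands=8):
    """Fixed-length descriptor from one tactile time series.

    Hypothetical features: simple temporal statistics plus band-averaged
    spectral energies, so the descriptor length does not depend on the
    recording duration or the sensing technology.
    """
    signal = np.asarray(signal, dtype=float)
    stats = [signal.mean(), signal.std(), signal.min(), signal.max()]
    # Band-averaged power spectrum: a duration-invariant spectral summary.
    spectrum = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    bands = np.array_split(spectrum, n_bands)
    spectral = [b.mean() for b in bands]
    return np.array(stats + spectral)

def describe_exploration(channels):
    """Average per-channel descriptors so the result does not depend on
    how many tactile channels (taxels) the skin provides."""
    return np.mean([tactile_descriptor(c) for c in channels], axis=0)

# Toy usage: random signals standing in for recorded tactile data
# from three hypothetical surface classes (16 channels, 500 samples each).
rng = np.random.default_rng(0)
X = np.array([describe_exploration(rng.normal(size=(16, 500)) * (1 + label))
              for label in range(3) for _ in range(20)])
y = np.repeat(np.arange(3), 20)
print(cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=5).mean())
```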
