Abstract

The sense of touch is essential for locating buried objects when vision-based approaches are limited. We present an approach for tactile perception when sensorized robot fingertips are used to directly interact with granular media particles in teleoperated systems. We evaluate the effects of linear and nonlinear classifier architectures and of three tactile sensor modalities (vibration, internal fluid pressure, fingerpad deformation) on the accuracy of fingertip contact state estimation. We propose an architecture called the Sparse-Fusion Recurrent Neural Network (SF-RNN), in which sparse features are autonomously extracted prior to fusing multimodal tactile data in a fully connected RNN input layer. The multimodal SF-RNN model achieved 98.7% test accuracy and was robust to modest variations in granular media type and particle size, fingertip orientation, fingertip speed, and object location. Fingerpad deformation was the most informative modality for haptic exploration within granular media, while vibration and internal fluid pressure provided additional information given appropriate signal processing. We introduce a real-time visualization of tactile percepts for remote exploration by constructing a belief map that combines probabilistic contact state estimates and fingertip location. The belief map visualizes the probability that an object is buried in the search region and could be used for planning.
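
To make the fusion idea concrete, the following is a minimal PyTorch sketch of the SF-RNN concept as described in the abstract: each modality passes through its own sparse feature extractor, and the resulting features are concatenated and fused in a fully connected layer feeding a recurrent classifier. The class name, layer sizes, input dimensions (e.g., 19 deformation channels, as on a BioTac-like fingertip), and the L1-based sparsity mechanism are all illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn

class SparseFusionRNN(nn.Module):
    """Illustrative sketch of the SF-RNN idea: per-modality sparse feature
    extraction, fully connected fusion at the RNN input, then a recurrent
    classifier over the tactile time series."""

    def __init__(self, dims=(32, 1, 19), feat=16, hidden=64, n_classes=2):
        super().__init__()
        # One encoder per modality (vibration, internal fluid pressure,
        # fingerpad deformation). Sparsity is encouraged here with an L1
        # penalty on the codes (see sparsity_loss); the paper's actual
        # extraction method may differ.
        self.encoders = nn.ModuleList(
            nn.Sequential(nn.Linear(d, feat), nn.ReLU()) for d in dims
        )
        # Fully connected fusion of the concatenated sparse features.
        self.fuse = nn.Linear(feat * len(dims), hidden)
        self.rnn = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_classes)

    def forward(self, modalities):
        # modalities: list of tensors, each shaped (batch, time, dim_i)
        codes = [enc(x) for enc, x in zip(self.encoders, modalities)]
        fused = torch.relu(self.fuse(torch.cat(codes, dim=-1)))
        out, _ = self.rnn(fused)
        # Classify contact state from the final hidden state.
        return self.head(out[:, -1]), codes

    @staticmethod
    def sparsity_loss(codes, weight=1e-3):
        # L1 penalty pushing per-modality features toward sparse activations.
        return weight * sum(c.abs().mean() for c in codes)
```

At training time, the classification loss would be combined with `sparsity_loss` on the returned codes so that the feature extraction stays sparse without manual feature engineering.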
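
The belief map can similarly be sketched in code. One plausible reading, assuming an occupancy-grid-style Bayesian update (the paper's exact construction may differ), is a grid of log-odds in which the classifier's contact probability is fused into the cell at the fingertip's current location:

```python
import numpy as np

def update_belief_map(log_odds, cell, p_contact, eps=1e-6):
    """Hypothetical log-odds update: fuse the classifier's contact
    probability at the fingertip's grid cell into the belief map of
    where an object is buried."""
    p = np.clip(p_contact, eps, 1 - eps)
    log_odds[cell] += np.log(p / (1 - p))
    return log_odds

# Usage: start from a grid of zeros (uniform prior); the fingertip
# visits cell (5, 7) and the classifier reports 92% contact probability.
belief = np.zeros((20, 20))
belief = update_belief_map(belief, (5, 7), p_contact=0.92)
prob_map = 1.0 / (1.0 + np.exp(-belief))  # convert back to probabilities
```

A map like `prob_map` directly supports the planning use mentioned above: cells with high object probability can be prioritized for further haptic exploration.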
