Abstract

Humans can perceive tactile directionality with angular perception thresholds of 14-40° via fingerpad skin displacement. Using deformable, artificial tactile sensors, a robotic system was given the ability to perceive tactile directionality to aid in object manipulation tasks. Two convolutional neural networks (CNNs) were trained on tactile images created from fingerpad deformation measurements during perturbations to a handheld object. A primary CNN regression model provided a point estimate of tactile directionality over a range of grip forces, perturbation angles, and perturbation speeds. A secondary CNN model provided a variance estimate that was used to quantify uncertainty about the point estimate. Model performance was evaluated with 5-fold cross-validation. The primary CNN produced tactile directionality point estimates with an error rate of 4.3% at a 20° angular resolution and was benchmarked against an open-source force estimation network. The model was implemented in real time during interactions with an external agent and with the environment, using objects of different shapes and widths. The perception of tactile directionality could be used to enhance the situational awareness of human operators of telerobotic systems and to develop decision-making algorithms for context-appropriate responses by semi-autonomous robots.
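The abstract describes a primary CNN that regresses a directionality point estimate from a tactile image and a secondary CNN that supplies a variance estimate for uncertainty. The sketch below illustrates one plausible realization of that pairing; it is not the authors' published code. The network sizes, the 64x64 single-channel tactile-image shape, and the Gaussian negative-log-likelihood training loss are all assumptions chosen only to make the two-network idea concrete.

```python
# Illustrative sketch only (assumed architecture and loss, not the paper's code).
import torch
import torch.nn as nn

class TactileCNN(nn.Module):
    """Small CNN mapping a single-channel tactile image to one scalar output."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.Flatten(), nn.Linear(32 * 16 * 16, 64), nn.ReLU(), nn.Linear(64, 1)
        )

    def forward(self, x):
        return self.head(self.features(x)).squeeze(-1)

# Primary network: point estimate of the perturbation angle (degrees).
angle_net = TactileCNN()
# Secondary network: predicts a log-variance, giving an uncertainty estimate.
logvar_net = TactileCNN()

def heteroscedastic_nll(angle_pred, logvar_pred, angle_true):
    # Gaussian negative log-likelihood; the variance term lets the secondary
    # network flag tactile images whose direction estimate is unreliable.
    return (0.5 * torch.exp(-logvar_pred) * (angle_true - angle_pred) ** 2
            + 0.5 * logvar_pred).mean()

# Example forward/backward pass on a batch of hypothetical 64x64 tactile images.
images = torch.randn(8, 1, 64, 64)
true_angles = torch.rand(8) * 360.0
loss = heteroscedastic_nll(angle_net(images), logvar_net(images), true_angles)
loss.backward()
```

In this sketch the point estimate and the variance estimate come from separate networks, mirroring the abstract's primary/secondary split; a shared backbone with two output heads would be an equally reasonable alternative.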
