Abstract

Robotic metamaterials are an emerging approach to creating synthetic structures that combine desired material characteristics with embodied intelligence, blurring the boundary between material and machine. Inspired by the functional qualities of biological skin, integrating tactile intelligence into such materials has attracted significant interest in both research and practical applications. This study introduces a Soft Robotic Metamaterial (SRM) design with omnidirectional adaptability and high-performance tactile sensing, achieved by combining vision-based motion tracking and machine learning. Two sensory integration methods are compared against a baseline consisting of a state-of-the-art motion tracking system and a force/torque sensor: an internal-vision design offering a high frame rate, and a more cost-effective external-vision design. The results show that the internal-vision SRM design achieves a tactile accuracy of 98.96%, enabling soft, adaptive tactile interactions that are especially beneficial for dexterous robotic grasping. The external-vision design delivers similar performance at a reduced cost and can be adapted for portability, benefiting material science education and robotic learning. This research advances tactile sensing using vision-based motion tracking in soft robotic metamaterials, and the open-source release on GitHub fosters collaboration and further exploration of the technology (https://github.com/bionicdl-sustech/SoftRoboticTongs).
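
To make the pipeline concrete, below is a minimal, hypothetical sketch of what "vision-based motion tracking plus machine learning" for tactile sensing can look like: fiducial markers on the soft structure are tracked with a camera, and a small learned regressor maps their displacements to contact forces. This is not the paper's released code; the ArUco markers, camera index, marker count, and MLP regressor are all illustrative assumptions.

```python
import cv2
import numpy as np
from sklearn.neural_network import MLPRegressor

# Marker tracking (OpenCV >= 4.7 ArUco API). Dictionary choice is arbitrary.
aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())

def marker_centers(frame):
    """Detect ArUco markers and return their centers sorted by marker id."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = detector.detectMarkers(gray)
    if ids is None:
        return None
    order = np.argsort(ids.flatten())
    return np.array([corners[i][0].mean(axis=0) for i in order])

# Offline step: learn a map from flattened marker displacements to contact
# forces measured by a reference force/torque sensor (the baseline named in
# the abstract). The random arrays below are placeholders so the sketch runs;
# real training data would come from paired camera/sensor recordings.
n_markers = 8                                    # assumed marker count
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 2 * n_markers))  # placeholder displacements
y_train = rng.normal(size=(200, 3))              # placeholder 3-axis forces
model = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=500)
model.fit(X_train, y_train)

# Online step: compare the current frame against a rest (no-contact) frame.
cap = cv2.VideoCapture(0)                        # camera index is an assumption
ok1, rest_frame = cap.read()                     # captured with no contact
ok2, frame = cap.read()                          # captured during contact
cap.release()

if ok1 and ok2:
    rest, now = marker_centers(rest_frame), marker_centers(frame)
    if (rest is not None and now is not None and len(rest) == len(now)
            and 2 * len(now) == X_train.shape[1]):
        displacement = (now - rest).flatten()[None, :]
        print("estimated contact force:", model.predict(displacement))
```

In a setup like this, the force/torque sensor plays the role the abstract assigns to the baseline: it supplies ground-truth labels during training, after which the camera alone provides tactile feedback, whether mounted internally (higher frame rate) or externally (lower cost).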
