Abstract

Although various designs have been reported in the literature for vision-based tactile sensors, only a few of them can detect pressing forces. Those that do typically rely on learning-based approaches, which critically depend on training and require large amounts of training data. This study presents a novel visuotactile sensing technology, named BiTac, for the development of large-scale vision-based tactile sensors. BiTac can accurately reconstruct its 3-D surface shape in real time and detect the bi-directional distribution of normal forces applied to its soft sensing surface. The sensor skin is made of soft, elastic materials embedding an array of markers that is tracked efficiently by a compact stereo camera whose parameters can be calibrated automatically. Notably, the complexity of mapping the displacement fields computed from the images to the normal force distribution is resolved by the finite element method, a generalized modeling technique for soft materials. This article comprehensively covers the design, fabrication, theoretical modeling, calibration, and experiments. The hardware capability and tactile perception were assessed quantitatively and qualitatively through real interactions with various objects. With its simple structure and analytical approach, the BiTac sensor can be easily customized and manufactured for different parts of robots, making it a potential design for mass production.
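To make the displacement-to-force mapping concrete, the sketch below illustrates the core idea behind a linear finite-element formulation: under linear elasticity, the nodal forces relate to the measured marker displacements through a precomputed stiffness matrix, f = K u. The matrix values, node count, and units here are invented for illustration and are not the paper's actual model.

```python
import numpy as np

def displacements_to_forces(K, u):
    """Map a measured displacement field u (n,) to a nodal force
    distribution (n,) via a linear-elastic stiffness matrix K (n, n).
    This is a toy stand-in for the paper's FEM pipeline."""
    return K @ u

# Toy 3-node stiffness matrix (symmetric positive definite, arbitrary values).
K = np.array([[ 2.0, -1.0,  0.0],
              [-1.0,  2.0, -1.0],
              [ 0.0, -1.0,  2.0]])

# Only the middle marker is displaced (e.g., pressed by 10 mm in toy units).
u = np.array([0.0, 0.01, 0.0])

f = displacements_to_forces(K, u)
# Note the mixed signs in f: neighboring nodes carry reaction forces of the
# opposite sign, mirroring the bi-directional force distribution the sensor
# is designed to detect.
```

In practice the stiffness matrix would be assembled from the mesh and material parameters of the soft skin, and the displacement field would come from stereo marker tracking rather than being given directly.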

