Abstract

Tactile sensors detect physical contact or pressure. They provide feedback about the physical environment and enable more natural and intuitive interaction with machines, with applications in agriculture, space exploration, healthcare, and the automotive industry. Capacitive, resistive, and vision-based (optical) tactile sensors have been proposed in the literature. This paper proposes a novel approach to estimating contact locations when multiple simultaneous contacts occur on a vision-based tactile sensor. The relationship between the contact force and the resulting physical deformation of the sensor material of a large-scale tactile sensor was studied with the aid of a custom-built hardware unit. The hardware consists of a custom-designed flat rectangular sensor surface coupled with a mono-vision camera that captures the surface deformation, allowing detailed deformation information to be recorded for multiple simultaneous contacts. A software-based deformation estimation method is proposed, in which the positions of a grid array of markers are obtained through a tracking algorithm, an estimation algorithm, and a graphical representation algorithm. In addition, separate analyses were carried out to identify the most suitable method for observing the deformation of the sensor material. The proposed approach estimates the contact position and deformation with an accuracy of more than 97%, showing that it outperforms existing state-of-the-art techniques in the detection of the contact position.

KEYWORDS: Vision-based tactile sensors, surface deformation, marker-based localization, contact point estimation
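To illustrate the general idea of marker-based deformation sensing described above, the following is a minimal sketch, not the authors' actual algorithm: it assumes printed grid markers are visible to the camera, detects their centroids in a rest frame and a deformed frame with OpenCV blob detection, and associates them by nearest neighbour to obtain per-marker displacement vectors. File names, parameter values, and the matching strategy are all illustrative assumptions.

```python
# Hypothetical sketch of marker-based deformation tracking (not the paper's method).
import cv2
import numpy as np

def detect_markers(gray):
    """Return an (N, 2) array of marker centroids found by blob detection."""
    params = cv2.SimpleBlobDetector_Params()
    params.filterByArea = True
    params.minArea = 10  # assumed marker size in pixels; tune for the real sensor
    detector = cv2.SimpleBlobDetector_create(params)
    keypoints = detector.detect(gray)
    return np.array([kp.pt for kp in keypoints], dtype=np.float32)

def estimate_displacements(rest_pts, deformed_pts):
    """Associate each rest marker with its nearest deformed marker
    and return the per-marker displacement vectors."""
    displacements = []
    for p in rest_pts:
        dists = np.linalg.norm(deformed_pts - p, axis=1)
        displacements.append(deformed_pts[np.argmin(dists)] - p)
    return np.array(displacements)

# Usage (image file names are placeholders): markers with the largest
# displacement magnitudes indicate likely contact locations; several
# local maxima would suggest multiple simultaneous contacts.
rest_gray = cv2.cvtColor(cv2.imread("rest.png"), cv2.COLOR_BGR2GRAY)
pressed_gray = cv2.cvtColor(cv2.imread("pressed.png"), cv2.COLOR_BGR2GRAY)
disp = estimate_displacements(detect_markers(rest_gray), detect_markers(pressed_gray))
print("Largest marker displacement (px):", np.linalg.norm(disp, axis=1).max())
```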
