Abstract

In this article, we present a low-cost multimodal tactile sensor capable of providing accelerometer, gyroscope, and pressure data using a seven-axis chip as a sensing element. This approach reduces the complexity of both the tactile sensor design and the collection of multimodal data. The tactile device is composed of a top layer (a printed circuit board (PCB) and a sensing element), a middle layer (soft rubber material), and a bottom layer (a plastic base), forming a sandwich structure. This structure allows the measurement of multimodal data when force is applied to different parts of the top layer of the sensor. The multimodal tactile sensor is validated through offline and real-time analyses and experiments. First, the spatial impulse response and sensitivity of the sensor are analyzed using accelerometer, gyroscope, and pressure data systematically collected from the sensor. Second, the estimation of contact location over a range of sensor positions and force values is evaluated using accelerometer and gyroscope data together with a convolutional neural network (CNN) method. Third, the estimated contact location is used to control the position of a robot arm. The results show that the proposed multimodal tactile sensor has the potential for robotic applications, such as tactile perception for robot control, human–robot interaction, and object exploration.
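
To make the contact-location estimation step concrete, the sketch below shows one plausible form of the CNN classifier described above: a small 1D convolutional network that maps fixed-length windows of the six accelerometer and gyroscope channels to discrete contact locations on the sensor surface. The class name, layer sizes, window length, and number of locations are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class ContactLocationCNN(nn.Module):
    """Minimal sketch: 1D CNN mapping a window of 6-channel IMU data
    (3-axis accelerometer + 3-axis gyroscope) to one of n_locations
    contact positions. All sizes are assumptions for illustration."""

    def __init__(self, n_locations: int = 9, window: int = 128):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(6, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * (window // 4), 128),
            nn.ReLU(),
            nn.Linear(128, n_locations),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 6, window) -- one window of accelerometer + gyroscope samples
        return self.classifier(self.features(x))

# Example: classify one 128-sample window of placeholder sensor data.
model = ContactLocationCNN(n_locations=9, window=128)
dummy_window = torch.randn(1, 6, 128)  # stand-in for real IMU readings
predicted_location = model(dummy_window).argmax(dim=1)
print(predicted_location.item())
```

In a real-time setting, the predicted location index would then be mapped to a target pose for the robot arm, as described in the third validation step.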
