Abstract

In this paper, a novel vision-based measurement (VBM) approach is proposed to estimate the contact force and classify the object material in a single grasp. This is the first event-based tactile sensor, utilizing the recent technology of neuromorphic cameras. Compared with conventional frame-based vision techniques, it provides higher sensitivity, lower latency, and lower computational cost and power consumption. Moreover, the dynamic vision sensor (DVS) has a high dynamic range, which improves the sensor's sensitivity and performance in poor lighting conditions. Two time-series machine learning methods, namely a time delay neural network (TDNN) and a Gaussian process (GP), are developed to estimate the contact force during a grasp, and a deep neural network (DNN) is proposed to classify the object material. Forty-eight experiments across four different materials are conducted to validate the proposed methods and compare them against measurements from a piezoresistive force sensor. Leave-one-out cross-validation is implemented to evaluate and analyze the performance of the proposed machine learning methods. The contact force is estimated with a mean squared error of 0.16 and 0.17 N for the TDNN and GP, respectively, and the four materials are classified with an average accuracy of 79.17% on unseen experimental data. The results demonstrate the applicability of event-based sensors to grasping applications.
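As background for the force-estimation method summarized above, a TDNN consumes a tapped delay line: each training example pairs the last few samples of the input time series (here, an event-based signal) with the target at the current step. The sketch below illustrates only this windowing step; the function name, window length, and toy signal are placeholders, not the authors' implementation.

```python
# Hypothetical sketch of the tapped-delay-line input a TDNN consumes.
# Each window collects the most recent `n_delays` samples of the signal,
# which would then be mapped to the contact force at that time step.

def delay_embed(signal, n_delays):
    """Return the list of windows [x[t-n_delays+1], ..., x[t]]."""
    return [signal[t - n_delays + 1 : t + 1]
            for t in range(n_delays - 1, len(signal))]

# Toy event-rate series (illustrative values only).
event_rate = [0.0, 0.2, 0.5, 0.9, 1.0]
windows = delay_embed(event_rate, 3)
# windows[0] == [0.0, 0.2, 0.5]
```

Each window would be fed to the network alongside the synchronized force-sensor reading as the regression target; leave-one-out evaluation then holds out one full experiment at a time and trains on the remaining ones.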
