Abstract
Tactile data processing and analysis remain essentially an open challenge. In this context, we demonstrate a method for touch modality classification using pre-trained convolutional neural networks (CNNs). The 3D tensorial tactile data generated by real human interactions with an electronic skin (E-Skin) are transformed into 2D images. Using a transfer learning approach based on a CNN, we address the challenging task of recognizing the object touched by the E-Skin. The feasibility and efficiency of the proposed method are demonstrated on a real tactile dataset, on which it outperforms the classification results previously reported for the same dataset in the literature.
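To make the described pipeline concrete, the following is a minimal, hypothetical sketch (not the authors' exact implementation) of the two steps named in the abstract: mapping a 3D tactile tensor to a 2D image and applying transfer learning with a pre-trained CNN. The framework (PyTorch/torchvision), the ResNet-18 backbone, the taxel-grid dimensions, the reshape scheme, and the number of classes are all assumptions introduced here for illustration.

```python
# Hypothetical sketch, not the paper's exact pipeline: a 3D tactile tensor
# (taxel rows x taxel cols x time) is unrolled into a 2D grey-scale image,
# then classified with a pre-trained CNN whose final layer is replaced for
# the touch classes (transfer learning).
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 3  # assumed number of touch classes, for illustration only

# 1) Example 3D tactile sample: a 4x4 taxel grid sampled over 300 time steps.
tactile_tensor = torch.rand(4, 4, 300)

# 2) Map the 3D tensor to a 2D image: stack each taxel's time series as a row,
#    then normalize the values to [0, 1].
image_2d = tactile_tensor.reshape(16, 300)
image_2d = (image_2d - image_2d.min()) / (image_2d.max() - image_2d.min())

# 3) Replicate to 3 channels and resize to the input size expected by the CNN.
x = image_2d.unsqueeze(0).unsqueeze(0).repeat(1, 3, 1, 1)       # (1, 3, 16, 300)
x = nn.functional.interpolate(x, size=(224, 224), mode="bilinear",
                              align_corners=False)

# 4) Transfer learning: freeze the pre-trained backbone and replace the
#    classification head with a new trainable layer for the touch classes.
model = models.resnet18(weights="IMAGENET1K_V1")
for p in model.parameters():
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)

logits = model(x)
print(logits.shape)  # torch.Size([1, NUM_CLASSES])
```

In practice, only the new head (and optionally the last backbone blocks) would be fine-tuned on the converted tactile images; the exact tensor-to-image mapping and backbone choice depend on the paper's setup, which the abstract does not specify.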