Abstract

One of the challenges in robotic grasping tasks is detecting whether a grip is stable. A lack of stability during a manipulation operation usually causes the grasped object to slip due to poor contact forces. An unstable grip is frequently caused by an inadequate pose of the robotic hand, by insufficient contact pressure, or both. Tactile data is essential for checking such conditions and, therefore, for predicting the stability of a grasp. In this work, we present and compare different methodologies based on deep learning for representing and processing tactile data for both stability and slip prediction.

Highlights

  • Tactile sensors provide a rich source of information regarding the contact a robotic hand or gripper experiences during a grasp or the manipulation of an object

  • Since we use a non-matrix tactile sensor, our goal is twofold: first, we propose two ways of interpreting non-structured tactile data in order to learn deep features; and second, we report the performance yielded by deep learning techniques on these two tasks using two custom datasets

  • We approach the tasks of grasp stability prediction and direction-of-slip detection from this learning perspective, so we use raw tactile values as input to our deep learning models. Although it has been shown in the literature that mixing multi-modal data such as vision, touch, sound and/or proprioception improves the performance of a robotic system [4,29,30,31], we only use tactile perception. While this could be seen as a limitation, our objective is to evaluate the performance of these deep learning models using only the registered responses from our unstructured tactile sensors
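As a minimal illustration of the idea in the last highlight — feeding raw tactile values directly to a learned model — a stability predictor can be sketched as a small feed-forward network that maps a flat vector of sensor readings to a stable/unstable probability. This is not the authors' actual architecture; the taxel count, layer sizes, and untrained random weights are all assumptions for the sketch:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical: 24 raw tactile values from a non-matrix (unstructured) sensor,
# flattened into a single feature vector rather than arranged as an image.
N_TAXELS = 24
HIDDEN = 16

# Randomly initialised weights stand in for a trained model.
W1 = rng.normal(scale=0.1, size=(N_TAXELS, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(scale=0.1, size=(HIDDEN, 1))
b2 = np.zeros(1)

def predict_stability(tactile: np.ndarray) -> float:
    """Return P(stable grasp) for one raw tactile reading."""
    h = np.maximum(tactile @ W1 + b1, 0.0)   # ReLU hidden layer
    logit = (h @ W2 + b2).item()             # single output unit
    return 1.0 / (1.0 + np.exp(-logit))      # sigmoid -> probability

reading = rng.uniform(0.0, 1.0, size=N_TAXELS)  # one synthetic sample
p = predict_stability(reading)
print(0.0 <= p <= 1.0)
```

In practice the weights would be fit on labelled grasp outcomes (stable vs. slipped); the point of the sketch is only that the input is the raw, unstructured tactile vector with no spatial reshaping.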



Introduction

Tactile sensors provide a rich source of information regarding the contact a robotic hand or gripper experiences during a grasp or the manipulation of an object. Luo et al. [2] reviewed multiple ways a tactile sensor can record pressure or force signals and how these can be processed to perceive properties such as material stiffness, shape or pose, among others. Some examples of these applications are the works of Kerzel et al. [3] and Liu et al. [4], who approached the task of recognising a material by performing touches throughout its surface, as well as the works of Schmitz et al. [5] and Velasco et al. [6], who approached the recognition of grasped objects with multi-fingered hands. Van Hoof et al. [7]

