Abstract

Dexterous robotic manipulation in unstructured environments remains challenging, despite the growing number of robots entering human settings every day. Although robotic manipulation offers complete solutions in factories and industrial settings, it still lacks essential techniques and remains clumsy or limited in unstructured environments. Everyday objects are typically designed for the human hand, and the human somatosensory system handles the complex computations required for dexterous manipulation in unstructured settings. Borrowing concepts from the human visuotactile system can improve dexterous manipulation and increase the use of robots in unstructured environments. In humans, the required finger and wrist joint adjustments occur after fast identification of the object in the initial stages of manipulation; fast object identification during these phases may likewise improve robotic dexterous manipulation. This paper explores human-inspired concepts such as the haptic glance to develop single-grasp object identification for robots. The concept can assist the early phases of robotic manipulation by supporting automated decision-making, such as choosing the grasp type and joint positions during manipulation tasks. The main stages developed here are detection of sensor activation and sample collection, using signal-to-noise-ratio and z-score filtering of tactile data. This procedure automates touch detection and reduces the sensor space used for classification. Experiments on a dataset of everyday objects produced compelling results that will support the later stages of early-phase robotic grasping.
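The filtering stages described above (SNR-based sensor selection and z-score-based touch detection) can be sketched roughly as follows. The thresholds, array shapes, and function names are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

def snr_db(signal):
    """Signal-to-noise ratio in dB, taken here as mean over standard deviation
    (an assumed definition; the paper may use a different estimator)."""
    mu, sigma = np.mean(signal), np.std(signal)
    if sigma == 0:
        return np.inf
    ratio = abs(mu) / sigma
    return 20 * np.log10(ratio) if ratio > 0 else -np.inf

def select_active_sensors(readings, snr_threshold_db=10.0):
    """Reduce the sensor space: keep only taxels whose SNR exceeds a
    threshold. `readings` is (n_sensors, n_samples); threshold is assumed."""
    return [i for i in range(readings.shape[0])
            if snr_db(readings[i]) > snr_threshold_db]

def detect_touch(signal, z_threshold=3.0):
    """Flag samples whose z-score magnitude exceeds a threshold,
    marking candidate touch events on one sensor's time series."""
    mu, sigma = np.mean(signal), np.std(signal)
    if sigma == 0:
        return np.zeros(len(signal), dtype=bool)
    z = (signal - mu) / sigma
    return np.abs(z) > z_threshold
```

In this sketch, sensor selection runs once per grasp to discard taxels that carry mostly noise, and touch detection then localizes activation peaks on the surviving channels.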

Highlights

  • The growing integration of robots into homes and hospitals has been driving robotic manipulation research in unstructured environments

  • This paper presents early results on automating sensor selection and peak detection for the early phases of tactile manipulation

  • A comparison between the filtered and unfiltered signal-to-noise ratios of tactile data provides a basis for initial sensor selection


Introduction

The growing integration of robots into homes and hospitals has been driving robotic manipulation research in unstructured environments. A major step forward in sensing and data processing is required to achieve human-level robotic manipulation in such settings. One aspect of human manipulation is haptic glance sensing: fast object recognition in the absence of visual stimuli. Humans can rapidly recognize objects in occluded environments using only tactile feedback [2] during the first phases of manipulation.


