Abstract

In order to obtain and analyze the operator's intention comprehensively and accurately in human-robot interaction, an array-type flexible tactile sensor was designed and encapsulated into a tactile handle that senses the grasping state of the human hand in real time. Based on an analysis of different operators' grasping postures and habits, the grasping state was divided into five modes. A grasping posture conversion method based on Harris feature point positioning and extraction was proposed to ensure that the extracted grasping features are complete and standardized. A Convolutional Neural Network (CNN) suited to real-time classification of grasping intention was built to distinguish the grasping states sensed by the handle, accurately determine the operator's intention, and complete the interaction with the robot. Using a UR collaborative robot as the experimental platform and the tactile handle as the intent-sensing device, an intent-behaviour mapping was constructed to control the motion of the robot. The experimental results show that the classification accuracy of operation intention reaches 97.87%.
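As a rough illustration of the pipeline described above, the sketch below normalizes one tactile frame around its strongest Harris feature point and classifies it with a small CNN into one of the five grasping modes. The array resolution, the Harris parameters, and the network layout are illustrative assumptions; this summary does not specify them.

```python
# Minimal sketch of the sensing-to-classification pipeline described in the
# abstract. The tactile array size (16x16), the Harris parameters, and the CNN
# layout are assumptions made for illustration only.
import numpy as np
import cv2
import torch
import torch.nn as nn

NUM_MODES = 5          # the five grasping modes defined in the paper
ARRAY_SIZE = 16        # assumed tactile-array resolution (rows = cols)

def normalize_posture(pressure_map: np.ndarray) -> np.ndarray:
    """Recenter the pressure map on its strongest Harris feature point.

    This stands in for the 'grasping posture conversion' step: it makes the
    extracted features less dependent on where the hand sits on the handle.
    """
    img = pressure_map.astype(np.float32)
    response = cv2.cornerHarris(img, blockSize=2, ksize=3, k=0.04)
    r, c = np.unravel_index(np.argmax(response), response.shape)
    # Roll the map so the dominant feature point lands at the array center.
    return np.roll(img, (ARRAY_SIZE // 2 - r, ARRAY_SIZE // 2 - c), axis=(0, 1))

class GraspCNN(nn.Module):
    """Small CNN that maps one normalized pressure frame to a grasp mode."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(16 * (ARRAY_SIZE // 4) ** 2, NUM_MODES)

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Example inference on one simulated frame.
frame = np.random.rand(ARRAY_SIZE, ARRAY_SIZE)            # stand-in sensor frame
x = torch.from_numpy(normalize_posture(frame)).float()[None, None]
mode = GraspCNN()(x).argmax(dim=1).item()                  # predicted grasp mode 0..4
print("predicted grasping mode:", mode)
```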

Highlights

  • With the continuous development of artificial intelligence technology, mechanical products have become widely used in daily life

  • In [2], human-robot interaction was divided into three main parts: intention detection, role assignment and motion control, and behaviour feedback

  • The sensor was encapsulated into a tactile handle to sense the grasping states of the human hand in real time


Summary

INTRODUCTION

With the continuous development of artificial intelligence technology, mechanical products have become widely used in daily life. The reasonable use and management of robots [1] is important when they assist operators in completing specified tasks efficiently. Simple extraction of intent information can no longer meet operators' working needs: robots must obtain more precise and comprehensive intention information to carry out construction operations and humanized human-robot interaction [3]. Electronic skin [16] and tactile data gloves [17] have been widely applied in human-robot interaction and have contributed greatly to the dexterous and accurate extraction of grasping information. The feasibility and effectiveness of the proposed method were verified on the UR collaborative robot platform.
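The intent-behaviour mapping verified on the UR platform can be pictured as a lookup from the classified grasping mode to a motion command for the robot. The sketch below is a minimal, hypothetical version: the mode labels, the velocity values, and send_speed_command() are illustrative stand-ins, not the interface used in the paper.

```python
# Hedged sketch of an intent-to-behaviour mapping for driving a UR collaborative
# robot. The five mode labels and the velocity commands are illustrative
# assumptions, and send_speed_command() is a hypothetical stand-in for whatever
# robot interface (e.g. a UR RTDE client) the experiment actually used.
from typing import Sequence

# Assumed mapping: grasp mode index -> Cartesian tool velocity [vx, vy, vz] (m/s)
INTENT_TO_MOTION = {
    0: (0.0, 0.0, 0.0),    # neutral grip -> hold position
    1: (0.05, 0.0, 0.0),   # mode 1       -> move +x
    2: (-0.05, 0.0, 0.0),  # mode 2       -> move -x
    3: (0.0, 0.0, 0.05),   # mode 3       -> move up
    4: (0.0, 0.0, -0.05),  # mode 4       -> move down
}

def send_speed_command(velocity: Sequence[float]) -> None:
    """Hypothetical placeholder for the robot's velocity interface."""
    print("commanded tool velocity:", velocity)

def on_grasp_mode(mode: int) -> None:
    """Translate one CNN classification result into a robot motion command."""
    send_speed_command(INTENT_TO_MOTION.get(mode, (0.0, 0.0, 0.0)))

on_grasp_mode(3)   # e.g. classifier output 3 -> move the tool upward
```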

TACTILE SENSOR PRINCIPLE AND TACTILE HANDLE DESIGN
EXPERIMENTAL VERIFICATION
Findings
CONCLUSION
