Abstract

The development of robots that can safely and effectively interact with people and assist them in structured environments is an open research problem whose importance has grown rapidly in recent years. Because these robots work in environments shared with human beings, they require new ways to achieve human–robot interaction and cooperation. This work presents an approach to human–robot interaction by means of robotic manipulators. The interaction consists of three main steps: the selection, the recognition, and the grasping of an object. Object selection is registered on the basis of a gesture performed by the user in front of an RGB-D camera, with each gesture associated with a particular object. Object recognition is achieved by means of the RGB cameras mounted on the two manipulator arms, which send images of the workspace to a dedicated classifier. For the grasping step, the object's position and orientation are extracted so that the gripper can be rotated to match the object on the desk in front of the robot. The final goal is to release the grasped object into the hand of the user standing in front of the desk. This system could support people with limited motor skills who are unable to retrieve an object on their own, playing an important role in structured assistive and smart environments and thus promoting human–robot interaction in activities of daily living.
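The grasping step described above rotates the gripper to match the object's planar orientation as seen by the cameras. The sketch below illustrates one plausible way to perform that pose-extraction step with OpenCV, assuming a binary segmentation mask of the selected object is already available from the recognition step; the function names, the largest-blob heuristic, and the 90° jaw-alignment rule are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np
import cv2


def object_pose_from_mask(mask):
    """Estimate planar position (px) and orientation (deg) of the largest blob
    in a binary segmentation mask, as a stand-in for the pose-extraction step."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    (cx, cy), (w, h), angle = cv2.minAreaRect(largest)
    # Heuristic: report the angle of the object's long axis
    # (minAreaRect's angle convention varies across OpenCV versions).
    if w < h:
        angle += 90.0
    return (cx, cy), angle % 180.0


def gripper_yaw_for_grasp(object_angle_deg):
    """Hypothetical mapping: align the gripper jaws perpendicular to the
    object's long axis so the fingers close across its narrow side."""
    return (object_angle_deg + 90.0) % 180.0


if __name__ == "__main__":
    # Synthetic test: a rotated rectangle standing in for a segmented object.
    mask = np.zeros((480, 640), dtype=np.uint8)
    box = cv2.boxPoints(((320, 240), (120, 40), 30.0)).astype(np.int32)
    cv2.fillPoly(mask, [box], 255)

    (cx, cy), angle = object_pose_from_mask(mask)
    yaw = gripper_yaw_for_grasp(angle)
    print(f"object centre: ({cx:.1f}, {cy:.1f}) px, long-axis angle: {angle:.1f} deg")
    print(f"gripper yaw command: {yaw:.1f} deg")
```

In a full system the pixel-space pose would still have to be mapped into the manipulator's frame (e.g. via a calibrated camera-to-robot transform) before commanding the gripper rotation.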
