Abstract

This paper presents the development of a Human-Robot Interface (HRI) that uses an RGB-D sensor to facilitate industrial robot programming. The main aims and functionalities of the HRI are described, and emphasis is then placed on two novel user interaction methods through which the operator’s movements and gestures are captured and used to position the joints of a 6-DOF articulated robot arm. In the first interaction method, the positions of the operator’s torso, shoulder and elbow determine the first three robot joints, while the operator’s gestures determine the remaining three joints. In the second interaction method, a combination of the operator’s movements and gestures is used to select any desired robot joint and then modify its angle by a user-configurable step. Both interaction methods also allow the operator to control the robot’s speed and gripper position. The HRI is complemented by a virtual model of the robotic arm, created in Unity3D, to evaluate online programming capabilities. Results from using the two interaction methods to position the robot in pre-defined poses are presented and discussed to highlight the strengths and weaknesses of each method. Finally, conclusions are drawn and suggestions for future work are made.
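The first interaction method relies on mapping tracked skeleton keypoints to joint angles. As a minimal illustration of this idea (not the paper's actual implementation), the sketch below computes the angle at a middle keypoint, such as the operator's elbow, from three 3D points of the kind an RGB-D skeleton tracker returns; the function name and interface are assumptions for illustration only.

```python
import math

def joint_angle(a, b, c):
    """Angle at point b (in degrees) between segments b->a and b->c.

    a, b, c are (x, y, z) tuples, e.g. shoulder, elbow and wrist
    keypoints from an RGB-D skeleton tracker. Illustrative helper,
    not taken from the paper.
    """
    v1 = tuple(ai - bi for ai, bi in zip(a, b))
    v2 = tuple(ci - bi for ci, bi in zip(c, b))
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    # Clamp to guard against floating-point values slightly outside [-1, 1].
    cos_t = max(-1.0, min(1.0, dot / (n1 * n2)))
    return math.degrees(math.acos(cos_t))

# Example: an elbow bent at a right angle.
shoulder = (0.0, 1.0, 0.0)
elbow = (0.0, 0.0, 0.0)
wrist = (1.0, 0.0, 0.0)
print(joint_angle(shoulder, elbow, wrist))  # 90.0
```

An angle computed this way could then be sent to the corresponding robot joint, or, as in the second interaction method, a selected joint's angle could instead be nudged up or down by a fixed, user-configurable step.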
