Abstract

Gestures are a natural expression of the human body and are widely used to communicate with other people. Gesture-based human-robot interaction is natural and convenient, and can be applied to complex interactive scenarios. In this paper, considering the real-time requirements of a human-robot cooperation (HRC) system and the variability of the interaction range, we combine a Kinect V2.0 (a far-range sensor) and a Leap Motion (a short-range, high-precision sensor) and propose a real-time multi-sensor gesture interaction system. First, a reasonable layout of the two sensors is discussed to realize far-range perception of natural gesture interaction. Then, nine gestures are defined that are easy for users to remember and perform. In addition, a gesture interaction mechanism is proposed that automatically switches between the two sensors according to the operator's distance. This mitigates defects such as occlusion and confusion during gesture control and overcomes the minimum-distance constraint of the Kinect. Finally, interactive experiments demonstrate the stability and accuracy of the system.
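
To make the distance-based switching idea concrete, the sketch below shows one possible sensor-selection policy in Python. The threshold value, hysteresis band, and the Sensor and SensorSwitcher names are illustrative assumptions rather than the paper's implementation; in a real system the operator's distance would come from Kinect skeleton tracking and gesture data from the respective SDKs.

# Minimal sketch of a distance-based sensor-switching policy (illustrative only).
# Threshold and hysteresis values are assumed, not taken from the paper.

from enum import Enum

class Sensor(Enum):
    LEAP_MOTION = "leap_motion"   # short range, high precision
    KINECT_V2 = "kinect_v2"       # far range

class SensorSwitcher:
    """Select the active gesture sensor from the operator's distance (metres).

    A hysteresis band around the switching boundary avoids rapid toggling
    when the operator stands near the threshold.
    """

    def __init__(self, switch_distance_m: float = 0.8, hysteresis_m: float = 0.1):
        self.switch_distance_m = switch_distance_m  # assumed boundary near Kinect's minimum range
        self.hysteresis_m = hysteresis_m
        self.active = Sensor.KINECT_V2

    def update(self, operator_distance_m: float) -> Sensor:
        if self.active is Sensor.KINECT_V2:
            # Hand over to Leap Motion only once the operator is clearly in close range.
            if operator_distance_m < self.switch_distance_m - self.hysteresis_m:
                self.active = Sensor.LEAP_MOTION
        else:
            # Return to Kinect once the operator is clearly in far range again.
            if operator_distance_m > self.switch_distance_m + self.hysteresis_m:
                self.active = Sensor.KINECT_V2
        return self.active

if __name__ == "__main__":
    switcher = SensorSwitcher()
    for d in (2.0, 1.0, 0.6, 0.75, 1.0):   # simulated operator distances in metres
        print(f"{d:.2f} m -> {switcher.update(d).value}")

The hysteresis band is one simple way to keep the hand-over between sensors stable when the operator lingers at the boundary of the Kinect's working range.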
