Abstract

Man‐machine interaction through hand gestures is a rich, natural, and intuitive way to control virtual and real environments. This paper proposes a vision‐based hand gesture interface (VBHGI) to remotely control a robot arm over the web. A VBHGI requires real‐time, robust hand detection and gesture recognition. Recognition is carried out in three phases: acquisition, segmentation, and identification of the hand posture. Since we use neither gloves nor markers, we propose appropriate motion‐detection and segmentation techniques. For the identification phase, we opted for principal component analysis (PCA) in order to better represent the gesture classes in reduced‐dimensional spaces. Once a gesture is recognized, it is analyzed and used as an articulation command to remotely control the robot arm's end effector.
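The abstract does not give implementation details, but the identification phase it describes, representing gesture classes in a PCA‐reduced space, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the number of components, the nearest‐class‐mean decision rule, and all function names are assumptions for the sake of the example.

```python
# Hypothetical sketch of the identification phase: project flattened hand
# silhouettes onto a PCA subspace and classify a new posture by the nearest
# class mean in that reduced space. Dimensions and names are illustrative.
import numpy as np
from sklearn.decomposition import PCA

N_COMPONENTS = 20  # assumed size of the reduced space


def train(train_images: np.ndarray, labels: np.ndarray):
    """train_images: (n_samples, h*w) flattened segmented hand images."""
    pca = PCA(n_components=N_COMPONENTS).fit(train_images)
    coeffs = pca.transform(train_images)
    # One mean vector per gesture class in the reduced space.
    class_means = {c: coeffs[labels == c].mean(axis=0)
                   for c in np.unique(labels)}
    return pca, class_means


def identify(pca: PCA, class_means: dict, hand_image: np.ndarray):
    """Return the gesture class whose mean is closest in PCA space."""
    coeff = pca.transform(hand_image.reshape(1, -1))[0]
    return min(class_means,
               key=lambda c: np.linalg.norm(coeff - class_means[c]))
```

In a pipeline like the one described, the returned class label would then be mapped to an articulation command for the robot arm's end effector.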
