Abstract

With recent developments in wearable augmented reality (AR), the role of natural human-computer interaction is becoming more important. Auxiliary interaction hardware adds complexity, weight, and cost to wearable AR systems, so natural means of interaction such as gestures are more desirable. In this paper, we present a novel multi-cue hand detection and tracking method for head-mounted AR systems that combines depth, color, intensity, and curvilinearity. Combining these cues increases the detection rate, eliminates background regions, and therefore improves tracking performance under challenging conditions. The detected hand positions and trajectories are used to perform actions such as clicking and selecting. Moreover, the 6 DOF poses of the hands are estimated by approximating the segmented regions with planes, allowing a planar menu (interface) to be rendered around the hand and the hand to be used as a planar selection tool. The proposed system is tested on different scenarios (including markers for reference), and the results show that it can detect and track hands successfully under challenging conditions such as cluttered and dynamic environments and illumination changes. The proposed hand tracker outperforms other well-known hand trackers under these conditions.
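
The plane-approximation step can be illustrated with a short sketch. The snippet below is a minimal illustration, not the authors' implementation: it assumes a segmented hand region is already available as an (N, 3) array of 3D points from the depth sensor, and the function name fit_plane_pose and the SVD-based least-squares plane fit are our own assumptions for illustration.

    # Illustrative sketch (assumed, not the paper's exact method): estimate a
    # 6 DOF hand pose by fitting a plane to the 3D points of a segmented hand
    # region, as described in the abstract for rendering a planar menu.
    import numpy as np

    def fit_plane_pose(points: np.ndarray) -> np.ndarray:
        """Fit a plane to an (N, 3) array of hand points and return a 4x4
        rigid transform whose z-axis is the plane normal and whose origin
        is the centroid of the points."""
        centroid = points.mean(axis=0)
        centered = points - centroid
        # SVD of the centered points: the right singular vectors are the
        # principal axes of the hand region; the one with the smallest
        # singular value is the least-squares plane normal.
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        x_axis, y_axis, normal = vt[0], vt[1], vt[2]
        # Enforce a right-handed coordinate frame.
        if np.dot(np.cross(x_axis, y_axis), normal) < 0:
            normal = -normal
        pose = np.eye(4)
        pose[:3, 0], pose[:3, 1], pose[:3, 2] = x_axis, y_axis, normal
        pose[:3, 3] = centroid
        return pose

    if __name__ == "__main__":
        # Synthetic, roughly planar "hand" points with noise for a quick check.
        rng = np.random.default_rng(0)
        pts = rng.uniform(-0.1, 0.1, size=(500, 3))
        pts[:, 2] = 0.02 * pts[:, 0] + 0.01 * rng.normal(size=500)
        print(fit_plane_pose(pts))

The returned transform places its origin at the hand centroid with the z-axis along the fitted plane normal, which is the kind of pose a renderer could use to attach a planar menu to the hand or treat the hand as a planar selection tool.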
