Abstract

With the resurgence of Head Mounted Displays (HMDs), in-air gestures offer a natural and intuitive mode of interaction. HMDs such as the Microsoft HoloLens and Daqri smart-glasses carry on-board processors and additional sensors, which make these devices expensive. Our goal, therefore, is to enable mass-market reach by extending the interaction space around mobile devices for Augmented/Virtual Reality, using only frugal head-mounts such as Google Cardboard and Wearality paired with a smartphone. One necessary step toward feasible human-computer interaction is gesture recognition, which must be preceded by reliable hand segmentation in the egocentric view. We propose a technique for real-time hand segmentation that uses only the RGB camera of an off-the-shelf mobile device. The novelty of our work lies in a filtering technique, which we term the Multi-Orientation Matched Filter, that performs hand segmentation on-device even against skin-like backgrounds. We have extensively evaluated our hand segmentation method on public datasets and compare it with existing methods under the same constraints. We demonstrate that our method outperforms them in computational time while achieving comparable accuracy. Further, we demonstrate that, using our solution, a zoom-gesture classifier can run in real time on Android smartphones.
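The exact Multi-Orientation Matched Filter is specified in the full paper; as an illustration of the general idea only, the sketch below (kernel shape, size, and parameters are our own assumptions, not the paper's) correlates a grayscale image with a bank of zero-mean, oriented ridge kernels and keeps the per-pixel maximum response across orientations, which a segmentation stage could then threshold.

```python
import numpy as np

def correlate2d(img, kernel):
    """Same-size cross-correlation with zero padding (matched filtering)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)))
    out = np.zeros(img.shape, dtype=float)
    for i in range(kh):
        for j in range(kw):
            out += kernel[i, j] * padded[i:i + img.shape[0], j:j + img.shape[1]]
    return out

def oriented_kernel(size, theta, sigma=2.0):
    """Illustrative elongated Gaussian ridge kernel rotated by angle theta."""
    half = size // 2
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    xr = xs * np.cos(theta) + ys * np.sin(theta)  # coordinate across the ridge
    k = np.exp(-(xr ** 2) / (2.0 * sigma ** 2))   # bright ridge along the rotated axis
    return k - k.mean()                           # zero-mean: flat regions respond with 0

def multi_orientation_response(img, n_orients=8, size=15):
    """Maximum matched-filter response over a bank of orientations."""
    thetas = np.linspace(0.0, np.pi, n_orients, endpoint=False)
    responses = [correlate2d(img, oriented_kernel(size, t)) for t in thetas]
    return np.max(responses, axis=0)
```

For example, a bright vertical stripe yields a strong positive response at the stripe and zero on the flat background, so a simple threshold on the maximum response separates the two; a hand-segmentation pipeline would apply such a bank to a skin-likelihood or intensity map rather than directly thresholding raw RGB.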

