Abstract

Portable depth-sensing cameras allow users to control interfaces with hand gestures at short range from the camera. These technologies are being combined with virtual reality (VR) headsets to produce immersive VR experiences that respond more naturally to user actions. In this research, we explore gesture-based interaction in immersive VR games using the Unity game engine, the Leap Motion sensor, a laptop, a smartphone, and the Freefly VR headset. By avoiding Android deployment, this novel setup allowed fast prototyping and testing of different ideas for immersive VR interaction at an affordable cost. We implemented a system that lets users play a game in a virtual world and compared two placements of the Leap Motion sensor: on the desk and on the headset. In this experimental setup, users interacted with a numeric dial panel and then played a Tetris game inside the VR environment by pressing the buttons of a virtual panel. The results suggest that, although the tracking quality of the Leap Motion sensor was rather limited in the head-mounted setup for pointing and selection tasks, its performance was much better in the desk-mounted setup, providing a novel platform for research and rapid application development.
