Abstract

Natural user interfaces based on hand gestures are becoming increasingly popular. Because hand tracking long required expensive hardware, a wide range of interaction possibilities that it enables has remained largely unexplored. Recently, hand tracking has been built into inexpensive and widely available hardware, giving more and more people access to this technology. This work provides researchers and users with a simple yet effective way to implement various one-handed gestures, enabling deeper exploration of gesture-based interactions and interfaces. To this end, it provides a framework for the design, prototyping, testing, and implementation of one-handed gestures. The proposed framework was implemented with two main goals: first, it should be able to recognize any one-handed gesture; second, designing and implementing a gesture should be as simple as performing it and pressing a button to record it. The contribution of this paper is a simple yet unique way to record and recognize static and dynamic one-handed gestures. A static gesture is recognized with a template matching approach, while dynamic gestures use previously captured spatial information. The presented approach was evaluated in a user study with 33 participants; the implemented gestures achieved high recognition accuracy and user acceptance.
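The paper does not specify the template matching procedure in the abstract, but the core idea of recognizing a static one-handed gesture against a recorded template can be sketched as follows. This is a minimal illustration under assumed conventions, not the authors' implementation: it assumes each hand pose is a list of 3D joint positions with the wrist as the first joint, and that poses are compared after translation and scale normalization. The function names, the distance measure, and the threshold value are illustrative choices.

```python
import math

def normalize(joints):
    # Translate so the wrist (assumed to be the first joint) is the
    # origin, then scale so the farthest joint lies at distance 1.
    # This makes the comparison position- and scale-invariant.
    wrist = joints[0]
    shifted = [(x - wrist[0], y - wrist[1], z - wrist[2]) for x, y, z in joints]
    scale = max(math.dist(p, (0.0, 0.0, 0.0)) for p in shifted) or 1.0
    return [(x / scale, y / scale, z / scale) for x, y, z in shifted]

def match_static_gesture(joints, templates, threshold=0.25):
    # Compare the current pose against each recorded template and
    # return the best match, or None if no template is close enough.
    # Distance is the mean per-joint Euclidean distance (a hypothetical
    # choice; other metrics or per-joint tolerances would also work).
    query = normalize(joints)
    best_name, best_dist = None, math.inf
    for name, template in templates.items():
        t = normalize(template)
        dist = sum(math.dist(a, b) for a, b in zip(query, t)) / len(query)
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name if best_dist <= threshold else None
```

Recording a gesture then amounts to storing the joint positions captured at the moment the user presses the record button; recognition is a lookup against the stored templates.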

Highlights

  • For Augmented (AR), Virtual (VR), and Mixed Reality (MR) research and its applications, gesture recognition and hand-gesture-based interfaces are becoming increasingly important

  • The time until recognition helps identify which gestures users found difficult to imitate

  • A long recognition time might indicate that the recognition configuration was too strict for a particular gesture and could be relaxed to improve it

Introduction

For Augmented (AR), Virtual (VR), and Mixed Reality (MR) research and its applications, gesture recognition and hand-gesture-based interfaces are becoming increasingly important. The importance of hand-based interaction is steadily growing, and hand gestures are used more and more frequently in various application and research scenarios. Devices built to support hand tracking usually come with a Software Development Kit (SDK) that provides visualization and simple interactions with virtual objects using the hands within immersive virtual environments. The Mixed Reality Toolkit (MRTK), for example, relies mostly on virtual objects such as menus, buttons, and sliders for interaction. This SDK should not be considered a framework for hand gestures but rather a toolkit that enables interaction with virtual objects through natural input, such as the hands or eye gaze.
