Abstract

In this paper, we present an inertial sensor-based touch and shake metaphor for expressive control of a 3D virtual avatar in a virtual environment. A wireless six-degrees-of-freedom inertial motion sensor serves as the gesture and motion input device, combined with a sensor fusion algorithm that tracks the user's hand motion in 3D space via magnetic, angular rate, and gravity sensors. A quaternion-based complementary filter is implemented to reduce noise and drift. For efficient recognition of dynamic hand gestures, we develop an algorithm based on dynamic time warping with real-time automatic hand gesture segmentation. Our approach recognizes gestures and estimates gesture variations for continuous interaction. We demonstrate this gesture expressivity with an interactive, flexible gesture-mapping interface for authoring and controlling a 3D virtual avatar and its motion by tracking the user's dynamic hand gestures. From hand gesture sequences captured by a single inertial motion sensor, the interface synthesizes stylistic variations of the avatar's motion, producing motions that are not present in the motion database.
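The abstract names two core algorithmic components: a quaternion-based complementary filter for orientation estimation and a dynamic time warping (DTW) matcher for gesture recognition. The paper's source code is not reproduced here, so the two Python sketches below illustrate only the general techniques. Every function name and parameter (`complementary_update`, the blend factor `alpha`, `dtw_distance`) is our own illustrative choice, and the paper's actual filter also fuses the magnetometer for yaw correction, which this sketch omits.

```python
import numpy as np

def quat_multiply(q, r):
    """Hamilton product of two quaternions stored as (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def complementary_update(q, gyro, accel, dt, alpha=0.98):
    """One generic complementary-filter step (illustrative only):
    integrate the gyroscope, then nudge the estimate toward the
    gravity direction measured by the accelerometer."""
    # Gyro propagation: q_dot = 0.5 * q (x) (0, omega)
    q = q + 0.5 * quat_multiply(q, np.array([0.0, *gyro])) * dt
    q = q / np.linalg.norm(q)

    # Gravity direction predicted by the current orientation (body frame).
    g_pred = np.array([
        2.0 * (q[1]*q[3] - q[0]*q[2]),
        2.0 * (q[0]*q[1] + q[2]*q[3]),
        q[0]**2 - q[1]**2 - q[2]**2 + q[3]**2,
    ])
    # Small-angle tilt error between measured and predicted gravity,
    # blended in with weight (1 - alpha) to suppress drift.
    err = np.cross(accel / np.linalg.norm(accel), g_pred)
    q = q + 0.5 * quat_multiply(q, np.array([0.0, *err])) * (1.0 - alpha)
    return q / np.linalg.norm(q)
```

For recognition, a standard DTW distance aligns an incoming (already segmented) gesture trace with stored templates and picks the nearest one; the warped distance can then be read as a measure of gesture variation for continuous control. Again, this is the textbook formulation, not necessarily the paper's exact algorithm:

```python
def dtw_distance(seq_a, seq_b):
    """Classic O(len(a)*len(b)) dynamic time warping between two
    gesture traces (sequences of feature vectors)."""
    n, m = len(seq_a), len(seq_b)
    D = [[float("inf")] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            # Euclidean distance between the two aligned samples.
            cost = sum((x - y) ** 2 for x, y in zip(seq_a[i - 1], seq_b[j - 1])) ** 0.5
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

def classify(gesture, templates):
    """Nearest-template classification: `templates` maps gesture
    names to recorded reference traces."""
    return min(templates, key=lambda name: dtw_distance(gesture, templates[name]))
```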

Highlights

  • Rapid developments in computing technology and recent advances in user interfaces have replaced conventional interaction tools such as the keyboard and mouse

  • We show that stylistic variations of 3D avatar motions can be quickly generated from a single example of motion data and user-specified dynamic gestures captured by a single inertial motion sensor

  • We demonstrate our dynamic gesture-based interactive control interface using kicking and punching motion examples driven by mimicking gesture patterns

Introduction

Rapid developments in computing technology and recent advances in user interfaces have replaced conventional interaction tools such as the keyboard and mouse. The use of human gestures to communicate with and control interactive applications, such as computer games and humanoid interfaces in virtual environments, is increasing. Gestural interfaces enable users to interact naturally with the virtual environment. Designing a dynamic gesture interface system that recognizes a user's actions and produces the corresponding application response in real time is a challenging task. Gestures are meaningful body motions and can include static or dynamic physical movements of the fingers, hands, face, or body used to interact with the environment. Gestural interfaces are currently being developed for applications such as virtual reality, sign language, and remote control using different motion tracking and gesture recognition techniques [1].
