Abstract

In this paper, an implementation of a virtual reality based application for drumkit simulation is presented. The system tracks user motion with a Kinect camera sensor and recognizes drum-hitting gestures in real time. To compensate for the latency introduced in the sensing stage and still provide real-time interaction, the system uses a gesture detection model that predicts user movements. The paper discusses two machine learning based solutions to this problem: the first is based on the analysis of velocity and acceleration peaks, while the second is based on Wiener filtering. The gesture detector was tested and integrated into a full drumkit simulator capable of discriminating among 3, 5, or 7 different drum sounds. An experiment with 14 participants was conducted to assess the system's viability and its impact on user experience and satisfaction.
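To illustrate the first family of solutions mentioned above, the following is a minimal sketch (not the paper's implementation) of a drum-hit detector that looks for downward velocity peaks in a stream of Kinect hand-joint positions. The frame rate, axis convention, thresholds, and all function names are assumptions introduced here for illustration only.

```python
import numpy as np

FRAME_RATE = 30.0      # assumed Kinect sampling rate (Hz)
DT = 1.0 / FRAME_RATE
VEL_THRESHOLD = -1.0   # assumed downward-velocity threshold (m/s)

def detect_hits(hand_y):
    """Return frame indices where a drum hit is likely.

    hand_y: 1-D array of the hand joint's vertical position (metres) per frame.
    A hit is flagged when the downward velocity exceeds the threshold and the
    acceleration changes sign, i.e. the downstroke reaches its velocity peak.
    """
    hand_y = np.asarray(hand_y, dtype=float)
    vel = np.gradient(hand_y, DT)   # vertical velocity per frame
    acc = np.gradient(vel, DT)      # vertical acceleration per frame

    hits = []
    for i in range(1, len(hand_y) - 1):
        fast_downstroke = vel[i] < VEL_THRESHOLD
        velocity_peak = acc[i - 1] < 0.0 <= acc[i]  # deceleration begins
        if fast_downstroke and velocity_peak:
            hits.append(i)
    return hits

if __name__ == "__main__":
    # Synthetic downstroke: the hand drops 0.5 m over half a second, with the
    # fastest downward motion (the velocity peak) reached midway.
    t = np.arange(0.0, 0.5, DT)
    y = 0.8 - 0.25 * (1.0 - np.cos(2.0 * np.pi * t))
    print(detect_hits(y))
```

In a latency-compensation setting such as the one described in the abstract, a detector like this would be combined with a short-horizon predictor (for example the Wiener-filter variant the authors mention) so that the sound can be triggered at the predicted moment of impact rather than after it is observed.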
