Abstract

In this paper, we present a novel real-time physical interaction system that allows users to throw a virtual ball without an intermediary device such as a controller. A dataset of throwing motions was captured from six actors, from which ground-truth Point of Release (PoR) frames were calculated. We trained a PoR prediction model on motion features extracted from arm joints and developed a detection algorithm that predicts the PoR of a throwing motion in real time. Evaluation of the system on pre-recorded throwing motion data yields detection errors of less than 50 ms. Qualitative results from six users of the real-time system in VR indicate that throwing without a controller felt very natural, although the system performed better for underarm throws than for overarm throws. Finally, we report the relative importance of different joints and motion features for the PoR prediction task.
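The abstract describes the pipeline only at a high level. As a rough, hypothetical illustration of the kind of per-frame motion features such a system might compute, the sketch below derives hand speed and acceleration from tracked joint positions and takes the peak-speed frame as a naive release estimate. The function names, the 90 Hz capture rate, and the peak-speed heuristic are assumptions for illustration only; they are not the paper's trained PoR prediction model.

```python
import numpy as np

def hand_motion_features(hand_pos, fps=90.0):
    """Per-frame speed and acceleration magnitude of the hand joint.

    hand_pos: (T, 3) array of tracked hand positions in metres.
    The 90 Hz rate and single-joint features are illustrative assumptions;
    the paper's model uses features from several arm joints.
    """
    dt = 1.0 / fps
    vel = np.gradient(hand_pos, dt, axis=0)   # (T, 3) finite-difference velocity
    acc = np.gradient(vel, dt, axis=0)        # (T, 3) finite-difference acceleration
    speed = np.linalg.norm(vel, axis=1)       # (T,) hand speed
    accel_mag = np.linalg.norm(acc, axis=1)   # (T,) acceleration magnitude
    return speed, accel_mag


def naive_release_frame(speed):
    """Toy baseline: treat the frame of peak hand speed as the Point of Release.

    This stands in for the paper's learned PoR predictor purely to make the
    frame-level detection task concrete.
    """
    return int(np.argmax(speed))


# Example usage on synthetic data: a hand accelerating along one axis.
if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 90)[:, None]
    hand_pos = np.concatenate([t**2, np.zeros_like(t), np.zeros_like(t)], axis=1)
    speed, _ = hand_motion_features(hand_pos)
    print("Estimated release frame:", naive_release_frame(speed))
```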
