Abstract

Motion-based control of video games has gained significant attention from both academic and industrial research groups for the unique interactive experiences it offers. Of particular research interest has been the control of games through gesture-based interfaces enabled by 3D cameras that have recently become affordable. However, existing research has yet to combine the benefits of a 3D camera with those of a physical game controller in a way that uses accurate gesture and controller tracking to provide six degrees of freedom and one-to-one correspondence between real-world 3D space and the virtual environment. This paper presents a natural human-machine interaction method whereby a user controls a virtual space by performing gestures with one hand and wielding a physical controller with the other. The data returned by a custom 3D depth camera is used to obtain not only hand gestures (the number of fingers and their angles) but also the absolute position of the physical controller. This 3D position data is then combined with the orientation data returned by the accelerometers and gyroscopes within the controller, and the combined data is fused in real time into a composite transformation matrix that is applied to a 3D object. Two game prototypes are presented that combine hand gestures and a physical controller to create a new kind of interactive gaming experience.
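As a rough illustration of the fusion step described above (a minimal sketch, not the paper's actual implementation), the per-frame update can be viewed as building a homogeneous transform whose rotation block comes from the controller's inertial sensors and whose translation comes from the depth camera. All function and variable names below are hypothetical.

```python
# Sketch: composing camera-derived position with IMU-derived orientation.
# Assumes the depth camera yields the controller's absolute 3D position and
# the accelerometer/gyroscope fusion yields a 3x3 rotation matrix.
import numpy as np

def compose_transform(rotation: np.ndarray, position: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation   # orientation from the controller's IMU
    T[:3, 3] = position    # absolute position from the 3D depth camera
    return T

# Example: controller rotated 90 degrees about the vertical (y) axis,
# located 0.5 m to the right of and 2 m in front of the camera.
theta = np.pi / 2
R = np.array([[ np.cos(theta), 0.0, np.sin(theta)],
              [ 0.0,           1.0, 0.0          ],
              [-np.sin(theta), 0.0, np.cos(theta)]])
p = np.array([0.5, 0.0, 2.0])

T = compose_transform(R, p)
vertex = np.array([0.0, 0.1, 0.0, 1.0])  # a point on the virtual object
print(T @ vertex)                        # that point placed in world space
```

Applying `T` to every vertex of a virtual object each frame is what gives the one-to-one correspondence between the controller's real-world pose and the object's pose in the virtual environment.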
