Abstract
As technological advances bring the predicted ubiquitous computing paradigm closer, the need for new interaction techniques also arises in entertainment content, mediated spaces, and sentient computing systems. The purpose of this study is to propose a gesture recognition method for game content production that uses inertial sensors to capture the user's motions and compares them with pre-defined motions. In addition, the method offers users a variety of input options by having them wear small controllers equipped with three-axis accelerometer sensors. Users play the experiential game by moving according to the action list displayed on the screen, and the game evaluates the accuracy and timing of their motions. Wearing several small wireless controllers on the major parts of the body (hands and feet) and applying the proposed method together makes the game more engaging.
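To make the idea of template-based motion matching concrete, the following is a minimal sketch, not the paper's implementation: it compares a recorded three-axis accelerometer sequence against a pre-defined motion template and scores both the accuracy and the timing of the motion. The resampling length, distance threshold, and timing tolerance are illustrative assumptions.

```python
import numpy as np

def resample(seq: np.ndarray, length: int) -> np.ndarray:
    """Linearly resample an (N, 3) acceleration sequence to a fixed length."""
    t_old = np.linspace(0.0, 1.0, len(seq))
    t_new = np.linspace(0.0, 1.0, length)
    return np.stack([np.interp(t_new, t_old, seq[:, axis]) for axis in range(3)], axis=1)

def match_gesture(recorded: np.ndarray, template: np.ndarray,
                  expected_time: float, actual_time: float,
                  dist_threshold: float = 1.5, time_tolerance: float = 0.3):
    """Return (recognized, accuracy_score, timing_score) for one motion.

    dist_threshold and time_tolerance are assumed values for illustration.
    """
    a = resample(recorded, 64)
    b = resample(template, 64)
    # Mean per-sample Euclidean distance between the two motion trajectories.
    dist = float(np.mean(np.linalg.norm(a - b, axis=1)))
    recognized = dist < dist_threshold
    accuracy = max(0.0, 1.0 - dist / dist_threshold)
    timing = max(0.0, 1.0 - abs(actual_time - expected_time) / time_tolerance)
    return recognized, accuracy, timing

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    template = np.cumsum(rng.normal(size=(80, 3)), axis=0)          # pre-defined motion
    recorded = template[::2] + rng.normal(scale=0.2, size=(40, 3))  # noisy user motion
    print(match_gesture(recorded, template, expected_time=1.0, actual_time=1.1))
```

In a multi-controller setup, each worn sensor (hands and feet) would be matched against its own template in the same way, and the per-controller scores combined into the overall rating for the on-screen action.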