Abstract
Orientation sensing is an important means of implementing embedded, technology-enhanced artifacts (often referred to as ‘smart artifacts’) that exhibit embodied means of interaction based on their position, orientation, and the respective dynamics. Considering artifacts subject to manual (or ‘by-hand’) manipulation by the user, we identify hand-worn, hand-carried, and (hand-)graspable real-world objects as exhibiting different orientation dynamics, justifying an analysis along these three categories. We refer to orientation dynamics as ‘gestures’ in an abstract sense and present a general framework for orientation-sensor-based gesture recognition. The framework specification is independent of sensor technology and classification methods, and it elaborates an application-independent set of gestures. It enables multi-sensor interoperability and accommodates a variable number of sensors. A core component of the framework is a gesture library containing gestures from three categories: hand gestures, gestures of artifacts held permanently in the hand, and gestures of artifacts that are detached from the hand and manipulated only occasionally. An inertial-orientation-sensing-based gesture detection and recognition system is developed and composed into a gesture-based interaction development framework. The use of this framework is demonstrated through the development of tangible remote controls for a media player, in both hardware and software.
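As a rough illustration of the gesture-library concept described above, a minimal Python sketch might organize sensor-agnostic gesture templates by the three artifact categories. The class and field names here are hypothetical, introduced only for illustration, and are not taken from the paper:

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Sequence

class GestureCategory(Enum):
    # The three artifact categories identified in the abstract
    HAND_WORN = auto()      # e.g. a sensor-equipped glove or wristband
    HAND_CARRIED = auto()   # an artifact held permanently in the hand
    GRASPABLE = auto()      # a detached artifact manipulated occasionally

@dataclass
class Gesture:
    name: str
    category: GestureCategory
    # Orientation trajectory: a sequence of (roll, pitch, yaw) samples,
    # stored independently of the sensor technology that produced them.
    template: Sequence[tuple[float, float, float]]

class GestureLibrary:
    """Application-independent store of gesture templates."""

    def __init__(self) -> None:
        self._gestures: list[Gesture] = []

    def register(self, gesture: Gesture) -> None:
        self._gestures.append(gesture)

    def by_category(self, category: GestureCategory) -> list[Gesture]:
        return [g for g in self._gestures if g.category == category]

# Usage: register a tilt gesture for a graspable remote-control artifact.
library = GestureLibrary()
library.register(Gesture(
    name="tilt_forward",
    category=GestureCategory.GRASPABLE,
    template=[(0.0, 0.0, 0.0), (0.0, 30.0, 0.0), (0.0, 60.0, 0.0)],
))
```

Keeping the templates as plain orientation trajectories, rather than raw sensor readings, is one way such a library could remain independent of both sensor technology and classification method, as the abstract specifies.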