Abstract

With the proliferation of touch-screen devices and sensors capable of tracking full-body movement in real time, consumers are experiencing a shift in how they interface with digital content. Such interfaces sense and interpret human motion, so interaction metaphors must be built with the body in mind. The hands are the most dexterous and among the most expressive parts of our bodies, and hence are central to the majority of modern gestural input paradigms. Making fuller use of the whole body, beyond the hands alone, enables a greater array of possible gestural input channels, yet this space has not been thoroughly explored. This paper discusses design considerations for somatic interactions that employ multiple degrees of bodily freedom to provide synchronous and parallel input channels. Because of the physical and spatial nature of our bodies, such interfaces are well suited to 3D graphical applications, and we present a user study comparing a traditional method of virtual object manipulation with gestural equivalents that use the whole body as a multi-channel interface.
