Abstract
Mid-air gestural interaction with multiple targets (or devices) in the physical space presents several challenges: gesture design becomes more complex in the attempt to avoid gesture conflicts and accidental activations; the potential commands are many, yet the gesture set must be kept small to remain memorable; and, from a design-process perspective, there are no readily available technologies for implementation and prototyping. In this paper, we present the Address and Command (A&C) interaction model, which enables two-handed mid-air interactions with multiple remote devices, following a research approach that includes gesture elicitation, prototyping, and evaluation of the user experience. The A&C model requires users to employ the non-dominant hand to address a device (address gestures) and the dominant hand to issue a command to it (command gestures). This approach affords interactions that simultaneously address and command multiple devices in ubiquitous environments that respond to mid-air input. A&C interactions also afford the design of a single gesture for the same command when it applies to multiple devices, thus decreasing the total number of gestures to be memorized and promoting end-user learning of the gesture set. We have (a) conducted an elicitation study (n = 18) to define A&C gestures for seven smart home devices and twelve commands; (b) developed a spatial augmented reality prototype that responds to mid-air gesture commands, using the MS Kinect sensor with Visual Gesture Builder, Unity 3D, and projection mapping onto foam mockups; and (c) tested the approach and the gesture vocabulary in terms of usability, memorability, and user experience in a scenario of 36 tasks across devices (n = 17). We found that A&C interactions are feasible, fast, error-free, easy to learn and remember, and highly valued in terms of user experience.
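To make the address-then-command pairing concrete, the following is a minimal sketch of how such two-handed input could be dispatched, assuming one gesture recognizer per hand. The device names, gesture labels, and the `Device`/`ACDispatcher` types are illustrative assumptions, not the authors' implementation; it only shows how a single command gesture (e.g. "swipe_up") can be reused across devices once a device has been addressed.

```python
# Illustrative sketch (not the paper's implementation) of A&C dispatching:
# the non-dominant hand addresses a device, the dominant hand commands it.
from dataclasses import dataclass, field
from typing import Callable, Dict, Optional


@dataclass
class Device:
    name: str
    # Maps a command-gesture label (e.g. "swipe_up") to a device-specific action.
    commands: Dict[str, Callable[[], None]] = field(default_factory=dict)


class ACDispatcher:
    """Routes two-handed A&C input: address gesture first, then command gesture."""

    def __init__(self, devices: Dict[str, Device]):
        self.devices = devices               # keyed by address-gesture label
        self.addressed: Optional[Device] = None

    def on_address_gesture(self, label: str) -> None:
        # Non-dominant hand: select (address) the target device.
        self.addressed = self.devices.get(label)

    def on_command_gesture(self, label: str) -> None:
        # Dominant hand: apply the command to the currently addressed device.
        # The same command label can map to different actions per device,
        # keeping the overall gesture vocabulary small.
        if self.addressed and label in self.addressed.commands:
            self.addressed.commands[label]()


# Usage: the same "swipe_up" command gesture acts on whichever device is addressed.
lamp = Device("lamp", {"swipe_up": lambda: print("lamp: brightness up")})
tv = Device("tv", {"swipe_up": lambda: print("tv: volume up")})
dispatcher = ACDispatcher({"point_left": lamp, "point_right": tv})
dispatcher.on_address_gesture("point_left")
dispatcher.on_command_gesture("swipe_up")    # -> lamp: brightness up
```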