Abstract

The auditory channel is a primary means of human interaction, and several research efforts aim to exploit this modality in the design of usable interactive systems. The goal of our present research has been to exploit interactive sonification capabilities to enhance the solely visual version of Framy so that it conveys the same information cues as those visualized on its interface. The basic version of Framy exploits a visual metaphor to provide hints about off-screen objects. Based on tactile input and non-speech sound output as alternative interaction modalities, the enhanced prototype now offers an appropriate tradeoff between the zoom level and the amount of information provided, and has motivated the design of multimodal interfaces that support end users by providing them with an additional means of accessing information.
