Abstract

This paper explores how multimodal interfaces can make it easier for people with sensory impairments to interact with mobile terminals such as PDAs and 3rd generation mobile phones (3G/UMTS). We have developed a flexible, speech-centric, composite multimodal interface to a map-based information service on a mobile terminal. This user interface has proven useful for users with different types of disabilities, ranging from persons with muscular atrophy combined with minor speech impairments to a severely dyslexic person and a person with aphasia. Some of the test persons were unable to use the ordinary public information service, either on the web (text only) or by calling a manually operated phone service (speech only). However, they fairly easily used our multimodal interface by pointing at the map on the touch screen while uttering short commands or phrases. Although this is a limited qualitative evaluation, it indicates that developing speech-centric multimodal interfaces to information services is a step in the right direction towards the goal of design for all.
