Abstract

Eye movements can carry a rich set of information about someone's intentions. For physically impaired people, gaze may be the only communication channel available. People with severe disabilities are usually assisted by helpers during everyday activities, which over time can lead to the development of an effective visual communication protocol between the helper and the disabled person. This protocol allows them to communicate, to some extent, simply by glancing at one another. Starting from this premise, we propose a new model of attentive user interface endowed with some of the visual comprehension abilities of a human helper. The purpose of this user interface is to identify the user's intentions and thus assist him/her in achieving simple interaction goals (e.g., object selection, task selection). The attentive interface is implemented through statistical analysis of the user's gaze data, based on a hidden Markov model.
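
The abstract does not detail the HMM formulation, but the general idea of intent recognition from gaze data can be sketched as follows: train (or hand-specify) one HMM per candidate intent over a discretized gaze-event alphabet, then classify an observed fixation/saccade sequence by comparing likelihoods under each model. The sketch below is purely illustrative, not the authors' implementation; the states, observation symbols, and all parameter values are invented assumptions, and the likelihood is computed with a standard scaled forward algorithm.

```python
import numpy as np

def forward_log_likelihood(obs, pi, A, B):
    """Log-likelihood of a discrete observation sequence under an HMM,
    computed with the forward algorithm (scaled to avoid underflow).
    pi: initial state distribution, A: state transitions (row-stochastic),
    B: emission probabilities B[state, symbol]."""
    alpha = pi * B[:, obs[0]]
    log_lik = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        log_lik += np.log(alpha.sum())
        alpha /= alpha.sum()
    return log_lik

# Hypothetical gaze alphabet: 0 = fixation on a target object,
# 1 = fixation on the interface/helper region, 2 = saccade elsewhere.
pi = np.array([0.5, 0.5])

# Invented "select" intent: the user dwells on the target object.
A_select = np.array([[0.9, 0.1],
                     [0.3, 0.7]])
B_select = np.array([[0.8, 0.15, 0.05],
                     [0.2, 0.30, 0.50]])

# Invented "browse" intent: gaze wanders with no strong preference.
A_browse = np.array([[0.5, 0.5],
                     [0.5, 0.5]])
B_browse = np.array([[0.3, 0.3, 0.4],
                     [0.3, 0.3, 0.4]])

obs = np.array([0, 0, 1, 0, 0, 0])  # a short example gaze sequence
ll_select = forward_log_likelihood(obs, pi, A_select, B_select)
ll_browse = forward_log_likelihood(obs, pi, A_browse, B_browse)
print("inferred intent:", "select" if ll_select > ll_browse else "browse")
```

In a real system the per-intent model parameters would be estimated from recorded gaze data (e.g., via Baum-Welch) rather than hand-picked as above.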
