Abstract

Gaze gestures have recently gained special interest in Human-Computer Interaction, granting new interaction possibilities, particularly for accessibility. Such interaction has been proposed for desktop and mobile devices; using a simple approach, we demonstrate that eye gestures can also provide a highly accurate interaction modality in a mixed reality environment. We introduce a new approach to investigate how gaze-tracking technologies could help people with ALS or other motor impairments interact with computing devices. In this paper, we propose a touch-free, eye-movement-based entry mechanism for mixed reality environments that can be used without any prior calibration. We describe the implementation of the method, evaluate the usability of the system with 7 participants, and discuss its advantages over traditional input modalities.
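The abstract mentions a calibration-free, eye-movement-based entry mechanism. One way such a mechanism can avoid calibration is to classify gestures from *relative* gaze displacement rather than absolute screen coordinates. The sketch below illustrates this general idea; the function name, thresholds, and four-direction gesture set are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch of calibration-free gaze-gesture classification.
# Because only relative displacement between samples is used, no per-user
# mapping from eye direction to screen coordinates is required.

def classify_gaze_gesture(samples, threshold=0.15):
    """Classify a sequence of (x, y) gaze-direction samples into one of
    four directional gestures, or None if movement is too small.

    samples: list of (x, y) tuples in normalized gaze coordinates.
    threshold: minimum net displacement to count as a gesture (assumed value).
    """
    if len(samples) < 2:
        return None
    # Net displacement from the first to the last sample.
    dx = samples[-1][0] - samples[0][0]
    dy = samples[-1][1] - samples[0][1]
    if max(abs(dx), abs(dy)) < threshold:
        return None  # movement too small: treat as a fixation, not a gesture
    # Dominant axis decides the gesture direction.
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

For example, a sweep of the eyes to the right, such as `classify_gaze_gesture([(0.0, 0.0), (0.3, 0.05)])`, would be classified as `"right"`, while small jitter around a fixation point returns `None`.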
