Abstract

EyeGrip is a novel yet simple technique that analyses eye movements to automatically detect the user's objects of interest in a sequence of visual stimuli moving horizontally or vertically across the user's view. We assess the viability of this technique in a scenario where the user looks at a sequence of images scrolling horizontally on the display while the user's eye movements are tracked by an eye tracker. We conducted an experiment to evaluate the performance of the proposed approach, and we also investigated how the scrolling speed and the maximum number of images visible on the screen influence the accuracy of EyeGrip. Based on the experimental results, we propose guidelines for designing EyeGrip-based interfaces. EyeGrip can be considered an implicit gaze interaction technique with potential use in a broad range of applications such as large screens, mobile devices, and eyewear computers. In this paper, we demonstrate the rich capabilities of EyeGrip with two example applications: 1) a mind reading game, and 2) a picture selection system. Our study shows that, with an appropriate scrolling speed and maximum number of images visible on the screen, the proposed method can be used in a fast scrolling task in which the system accurately (87%) detects the moving images that are visually appealing to the user, stops the scrolling, and brings the item(s) of interest back onto the screen.
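The abstract does not specify the detection algorithm, but the core idea of following a scrolling item with the eyes can be illustrated with a minimal sketch. The code below assumes, purely for illustration, that interest is signalled when gaze velocity roughly matches the scroll velocity (a smooth-pursuit-like pattern); all names, thresholds, and values are hypothetical and not taken from the paper.

```python
# Illustrative sketch of an EyeGrip-style interest detector.
# Assumption: the user is "gripping" an item when their gaze moves in the
# scroll direction at roughly the scroll speed. Thresholds are made up.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class GazeSample:
    t: float   # timestamp in seconds
    x: float   # horizontal gaze position in pixels


def gaze_velocity(samples: List[GazeSample]) -> float:
    """Mean horizontal gaze velocity (px/s) over a window of samples."""
    if len(samples) < 2:
        return 0.0
    dt = samples[-1].t - samples[0].t
    return (samples[-1].x - samples[0].x) / dt if dt > 0 else 0.0


def follows_scroll(samples: List[GazeSample],
                   scroll_velocity: float,
                   tolerance: float = 0.25) -> bool:
    """True if gaze moves in the scroll direction at roughly the scroll speed."""
    v = gaze_velocity(samples)
    return abs(v - scroll_velocity) <= abs(scroll_velocity) * tolerance


def item_under_gaze(gaze_x: float, item_lefts: List[float],
                    item_width: float) -> Optional[int]:
    """Index of the scrolling item whose horizontal extent contains the gaze."""
    for i, left in enumerate(item_lefts):
        if left <= gaze_x <= left + item_width:
            return i
    return None


if __name__ == "__main__":
    # Hypothetical usage: if the last window of gaze samples tracks the scroll,
    # stop scrolling and report which visible item the user was following.
    scroll_v = -300.0  # px/s, images moving leftwards (assumed value)
    window = [GazeSample(t=i * 0.02, x=800.0 + scroll_v * i * 0.02)
              for i in range(10)]
    if follows_scroll(window, scroll_v):
        idx = item_under_gaze(window[-1].x, item_lefts=[100, 400, 700],
                              item_width=250)
        print(f"Gaze follows the scroll; candidate item of interest: {idx}")
```

In this sketch, stopping the scroll and bringing the item back would simply mean reversing the scroll offset so that the item at the returned index is restored to view; the actual classifier, speeds, and layout parameters studied in the paper may differ.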
