Abstract

Objective: Electroencephalography (EEG) and eye tracking can potentially provide information about which items displayed on the screen are relevant for a person. Exploiting this implicit information promises to enhance various software applications. The specific problem addressed by the present study is that items shown in real applications are typically diverse. Accordingly, the saliency of the information that allows relevant items to be discriminated from irrelevant ones varies. As a consequence, recognition can happen in foveal or in peripheral vision, i.e., either before or after the saccade to the item. Neural processes related to recognition are therefore expected to occur with a variable latency with respect to the eye movements. The aim was to investigate whether relevance estimation based on EEG and eye tracking data is possible despite this variability.

Approach: Sixteen subjects performed a search task in which the target saliency was varied while the EEG was recorded and the unrestrained eye movements were tracked. Based on the acquired data, it was estimated which of the displayed items were targets and which were distractors in the search task.

Results: Target prediction was possible even when the stimulus saliencies were mixed. The information contained in the EEG and eye tracking data was found to be complementary, and neural signals were captured despite the unrestricted eye movements. The classification algorithm was able to cope with the experimentally induced variable timing of the neural activity related to target recognition.

Significance: It was demonstrated how EEG and eye tracking data can provide implicit information about the relevance of items on the screen for potential use in online applications.
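To make the described estimation concrete, the following is a minimal sketch, not the authors' actual pipeline: it assumes fixation-locked EEG epochs and a simple eye tracking feature (fixation duration) per item, and combines them in a linear classifier that labels items as targets or distractors. All variable names, feature choices, and parameters are illustrative assumptions.

```python
# Hypothetical sketch: combine fixation-locked EEG features with eye tracking
# features to classify fixated items as targets vs. distractors.
# Inputs are placeholders; names and shapes are illustrative only.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

n_items, n_channels, n_times = 200, 32, 100  # fixated items, EEG channels, samples per epoch
eeg_epochs = rng.standard_normal((n_items, n_channels, n_times))  # EEG cut around each fixation
fix_duration = rng.uniform(0.1, 0.6, size=(n_items, 1))           # eye tracking feature per item
labels = rng.integers(0, 2, size=n_items)                         # 1 = target, 0 = distractor

# Simple EEG features: mean amplitude in consecutive time windows per channel,
# which tolerates some latency jitter of the recognition-related response.
n_windows = 5
windows = np.array_split(np.arange(n_times), n_windows)
eeg_features = np.stack(
    [eeg_epochs[:, :, w].mean(axis=2) for w in windows], axis=2
).reshape(n_items, -1)

# Concatenate EEG and eye tracking features and classify with a linear model.
features = np.hstack([eeg_features, fix_duration])
clf = make_pipeline(StandardScaler(), LinearDiscriminantAnalysis())
scores = cross_val_score(clf, features, labels, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.2f}")
```

Windowed mean amplitudes are used here merely as one common way to build features that are somewhat robust to variable response latency; the study's own algorithm may differ.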

Highlights

  • Electroencephalography (EEG) and eye tracking can potentially be used to estimate which items displayed on the screen are relevant for the user

  • It was estimated which of the items displayed on the screen were targets of the search task, based on combined EEG and eye tracking features

  • It was demonstrated how EEG and eye tracking can provide information about which items displayed on the screen are relevant in a search task with unconstrained eye movements and mixed item saliency


Introduction

Electroencephalography (EEG) and eye tracking can potentially be used to estimate which items displayed on the screen are relevant for the user. Exploiting this implicit information promises to enhance different types of applications and could, e.g., serve as additional input to computer software next to mouse and keyboard (cf. Hajimirza et al., 2012; Eugster et al., 2014, for the single modalities). In brain-computer interface (BCI) experiments, stimuli are typically flashed on the screen, and the timing of stimulus recognition is therefore precisely known. This information cannot be expected in common software applications, where several possibly important items are displayed in parallel rather than flashed in succession. EEG and eye tracking have been measured in parallel to study eye-fixation-related potentials during reading (e.g., Baccino and Manunta, 2005; Dimigen et al., 2011, 2012) and search tasks (Sheinberg and Logothetis, 2001; Luo et al., 2009; Pohlmeyer et al., 2010, 2011; Rämä and Baccino, 2010; Dandekar et al., 2012; Kamienkowski et al., 2012; Brouwer et al., 2013; Dias et al., 2013; Kaunitz et al., 2014).
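The studies cited above rely on eye-fixation-related potentials, i.e., EEG segments time-locked to fixation onsets reported by the eye tracker rather than to stimulus flashes. A minimal sketch of such fixation-locked epoching, with all array names and parameters assumed for illustration, could look as follows:

```python
# Hypothetical sketch: cut EEG epochs around fixation onsets from the eye
# tracker, instead of around stimulus flash onsets as in classical BCI.
import numpy as np

fs = 500                                    # assumed EEG sampling rate (Hz)
eeg = np.random.randn(32, 60 * fs)          # channels x samples (placeholder data)
fixation_onsets_s = np.array([1.2, 2.7, 4.1, 5.9])  # fixation onsets (s), from the eye tracker

pre, post = int(0.2 * fs), int(0.8 * fs)    # epoch window: -200 ms to +800 ms
epochs = []
for t in fixation_onsets_s:
    i = int(round(t * fs))
    if i - pre >= 0 and i + post <= eeg.shape[1]:
        seg = eeg[:, i - pre:i + post]
        seg = seg - seg[:, :pre].mean(axis=1, keepdims=True)  # baseline correction
        epochs.append(seg)

epochs = np.stack(epochs)                   # (n_fixations, n_channels, n_samples)
print(epochs.shape)
```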
