Abstract

Sensing the mental state induced by different task contexts, where cognition rather than emotion is in the foreground of consciousness, is as important as sensing affective states, because completing tasks occupies much of waking life. However, few public datasets support mental state analysis, especially ones that use the eye as the sensing modality and provide detailed ground truth for eye behaviors. In this study, we contribute IREye4Task, a high-quality, publicly accessible eye video dataset in which the eyelid, pupil, and iris boundaries are annotated in each of more than a million frames, capturing eye behaviors as responses to four task contexts and two task load levels. We also propose a series of eye behavior representations that offer insight into how the eye behaves during different mental states. Finally, we benchmark three mental-state recognition tasks on this dataset to demonstrate the effectiveness of these representations. To our knowledge, this is the first public wearable eye video dataset for mental state analysis with high-quality eye landmarks and a variety of mental states, and the first study to analyze comprehensive eye behaviors beyond the pupil size and blink measures used in previous work.
