Abstract
This work introduces a new multispectral database and a framework to train and evaluate eyeblink detection in the RGB and Near-Infrared (NIR) spectra. Our contributed dataset (mEBAL2, multimodal EyeBlink and Attention Level estimation, Version 2) is the largest existing eyeblink database, providing a valuable resource for improving data-driven multispectral approaches to blink detection and related applications (e.g., attention level estimation). mEBAL2 includes 21,100 image sequences from 180 different students (more than 2 million labeled images in total), captured while they conducted a number of e-learning tasks of varying difficulty or took a real introductory HTML course on the edX MOOC platform. mEBAL2 uses multiple sensors, including two NIR cameras and one RGB camera to capture facial gestures during the execution of the tasks, as well as an Electroencephalogram (EEG) band to record the user's cognitive activity and blinking events. Furthermore, this work proposes three data-driven approaches as benchmarks for blink detection on mEBAL2, among which an architecture based on Convolutional Long Short-Term Memory (ConvLSTM) achieves detection performance of up to 99%. The experiments explore whether combining RGB and NIR data, both during training and in architectures that fuse the two modalities, improves blink detection. They show that the NIR spectrum enhances results even when only RGB images are available at inference time. Finally, the generalization capacity of the proposed eyeblink detectors, along with state-of-the-art eyeblink detection implementations, is validated in wilder and more challenging environments such as the HUST-LEBW dataset, demonstrating the usefulness of mEBAL2 for training a new generation of data-driven approaches to eyeblink detection.
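As a rough illustration of the ConvLSTM-based benchmark mentioned above, the following Python/Keras sketch builds a small ConvLSTM classifier over sequences of eye crops. The sequence length, crop size, filter counts, and layer arrangement are illustrative assumptions, not the paper's exact architecture.

# A minimal sketch of a ConvLSTM eyeblink classifier (assumed hyperparameters,
# not the authors' published model). Input: short sequences of grayscale eye
# crops; output: blink probability for the sequence.
import tensorflow as tf
from tensorflow.keras import layers, models

def build_convlstm_blink_detector(seq_len=19, height=50, width=50, channels=1):
    """Binary blink/no-blink classifier over a sequence of eye crops."""
    model = models.Sequential([
        layers.Input(shape=(seq_len, height, width, channels)),
        # ConvLSTM layers model the spatio-temporal dynamics of eyelid motion.
        layers.ConvLSTM2D(32, (3, 3), padding="same", return_sequences=True),
        layers.BatchNormalization(),
        layers.ConvLSTM2D(16, (3, 3), padding="same", return_sequences=False),
        layers.BatchNormalization(),
        layers.GlobalAveragePooling2D(),
        layers.Dense(64, activation="relu"),
        layers.Dense(1, activation="sigmoid"),  # blink probability
    ])
    model.compile(optimizer="adam", loss="binary_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_convlstm_blink_detector()
model.summary()

A multispectral variant along the lines explored in the paper could train this same backbone on mixed RGB and NIR sequences (or fuse two such branches), which is one way the NIR data can benefit a detector that only sees RGB at inference time.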