Abstract

Emotion recognition based on machine learning is widely used in both science and commerce. Responding to the demand for deep learning techniques for automatic emotion detection from biological signals, and to our own business needs as a neuromarketing laboratory, we created a large dataset of eye-tracking and biometric data suitable for emotion recognition tasks. The EmoEye database sample consisted of 200 people (147 women, 49 men, 4 non-binary individuals; 27.46 ± 11.45 years old). Each respondent viewed 316 images from the Open Affective Standardized Image Set (OASIS) and rated them on the arousal and valence scales of the Self-Assessment Manikin (SAM) questionnaire. Eye tracking, galvanic skin response (GSR), and photoplethysmogram (PPG) were recorded throughout the experiment, and demographic data were collected for each respondent. The image ratings on the valence scale did not differ statistically from the standard ratings of the corresponding images in the original stimulus set, and the overall distributions of ratings on both scales across image categories were similar between the standard ratings and those obtained from our respondents. The result of this study is a corpus of GSR, heart rate variability (HRV), and eye movement data (fixation coordinates, fixation duration, and average pupil size for the right and left eye) that has already been used to train a multimodal neural network within our laboratory and is ready for further implementation.
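
The abstract reports training a multimodal neural network on the GSR, HRV, and eye movement streams but does not describe its architecture. The sketch below, assuming PyTorch, per-fixation feature vectors, and continuous valence/arousal targets, is only one illustrative way such modalities could be fused; the class name `MultimodalEmotionNet` and all layer sizes are hypothetical, not the authors' model.

```python
import torch
import torch.nn as nn

class MultimodalEmotionNet(nn.Module):
    """Illustrative fusion model for EmoEye-style features.

    Each modality gets a small encoder; the concatenated embeddings
    feed a joint head that regresses valence and arousal (SAM scales).
    """
    def __init__(self, eye_dim=5, gsr_dim=1, hrv_dim=1, hidden=32):
        super().__init__()
        self.eye_enc = nn.Sequential(nn.Linear(eye_dim, hidden), nn.ReLU())
        self.gsr_enc = nn.Sequential(nn.Linear(gsr_dim, hidden), nn.ReLU())
        self.hrv_enc = nn.Sequential(nn.Linear(hrv_dim, hidden), nn.ReLU())
        self.head = nn.Linear(3 * hidden, 2)  # [valence, arousal]

    def forward(self, eye, gsr, hrv):
        z = torch.cat(
            [self.eye_enc(eye), self.gsr_enc(gsr), self.hrv_enc(hrv)],
            dim=-1,
        )
        return self.head(z)

# Usage with dummy data: a batch of 8 samples.
# Eye features: fixation x, y, duration, left and right pupil size.
model = MultimodalEmotionNet()
eye = torch.randn(8, 5)
gsr = torch.randn(8, 1)  # skin conductance level
hrv = torch.randn(8, 1)  # heart rate variability summary
print(model(eye, gsr, hrv).shape)  # torch.Size([8, 2])
```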
