Abstract

Technical advances, particularly the integration of wearable and embedded sensors, make it possible to track physiological responses in a less intrusive way. Many devices currently allow gathering biometric measurements from human beings, such as EEG headsets or health bracelets. The large data sets generated by tracking EEG and physiological signals may be used, among other things, to infer knowledge about human moods and emotions. Apart from direct biometric signal measurement, eye tracking systems are nowadays capable of determining the point of gaze of users interacting in ICT environments, which adds value to research in many different areas, such as psychology or marketing. We present a process in which devices for eye tracking, biometric, and EEG signal measurement are used synchronously to study both basic and complex emotions. Given the study requirements and cost constraints, we selected the least intrusive devices for each type of signal, so that users would behave as naturally as possible. On the one hand, we have been able to determine the basic emotions participants were experiencing by means of valence and arousal. On the other hand, a complex emotion such as empathy has also been detected. To validate the usefulness of this approach, a study involving forty-four people was carried out, in which participants were exposed to a series of affective stimuli while their EEG activity, biometric signals, and eye position were synchronously recorded to detect self-regulation. The working hypothesis was that people who self-regulated would show significantly different patterns in their EEG data. Participants were divided into two groups depending on whether Electrodermal Activity (EDA) data indicated that they self-regulated or not. The comparison of results obtained with different machine learning algorithms for emotion recognition shows that using EEG activity alone as a predictor does not allow properly determining whether a person is self-regulating their emotions while watching affective stimuli. However, adequately combining different data sources in a synchronous way to detect emotions makes it possible to overcome the limitations of single detection methods.
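
The abstract does not detail the synchronization pipeline, so the following is only a minimal sketch of how timestamped EEG, Empatica (EDA/BVP), and eye-tracker streams could be aligned on a common clock. File names, column names, sampling rates, and tolerance windows are illustrative assumptions, not the authors' actual implementation.

```python
# Minimal sketch: aligning EEG, Empatica (EDA), and eye-tracker exports on a
# common timeline. Assumes each device writes a CSV with a Unix-seconds
# "timestamp" column; all names and tolerances are illustrative assumptions.
import pandas as pd

def load_stream(path, time_col="timestamp"):
    """Load one device export and sort it by its timestamp column."""
    df = pd.read_csv(path)
    df[time_col] = pd.to_datetime(df[time_col], unit="s")
    return df.sort_values(time_col)

eeg = load_stream("eeg.csv")              # e.g. per-channel samples, ~128 Hz
eda = load_stream("empatica_eda.csv")     # skin conductance, ~4 Hz
gaze = load_stream("eyetracker.csv")      # point of gaze, ~60 Hz

# Align the slower streams to the EEG clock: for every EEG sample, take the
# nearest EDA and gaze samples within a small tolerance window.
merged = pd.merge_asof(eeg, eda, on="timestamp",
                       direction="nearest", tolerance=pd.Timedelta("250ms"))
merged = pd.merge_asof(merged, gaze, on="timestamp",
                       direction="nearest", tolerance=pd.Timedelta("20ms"))

# The merged table can then be windowed per affective stimulus and used as
# input to the emotion-recognition classifiers compared in the study.
print(merged.head())
```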

Highlights

  • Emotion is a subjective experience characterized by psychophysiological expressions, biological reactions, and mental states

  • With the data collected by the Emotiv headset, it can be observed that the right hemisphere clearly generated more alpha activity between 7 and 10 Hz when users viewed unpleasant images (a minimal sketch of this comparison follows this list)

  • These results can also be related to the Galvanic Skin Response (GSR) and Blood Volume Pulse (BVP) measures captured by the Empatica device

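As a rough illustration of the hemisphere comparison mentioned above, the sketch below estimates alpha-band (7-10 Hz) power per hemisphere from frontal channels. The channel names follow the Emotiv EPOC layout and the synthetic data are illustrative assumptions; the study's actual channel selection and processing are not described here.

```python
# Minimal sketch: comparing alpha-band (7-10 Hz) power between hemispheres.
# Channel names follow the Emotiv EPOC layout; sampling rate and the synthetic
# signals are illustrative assumptions, not the study's recorded data.
import numpy as np
from scipy.signal import welch

FS = 128  # assumed EEG sampling rate in Hz
LEFT_CHANNELS = ["AF3", "F3", "F7", "FC5"]
RIGHT_CHANNELS = ["AF4", "F4", "F8", "FC6"]

def alpha_power(signal, fs=FS, band=(7.0, 10.0)):
    """Mean power spectral density of one channel inside the alpha band."""
    freqs, psd = welch(signal, fs=fs, nperseg=fs * 2)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

def hemisphere_alpha(eeg, channels):
    """Average alpha power over a set of channels; eeg maps name -> 1-D array."""
    return np.mean([alpha_power(eeg[ch]) for ch in channels])

# Synthetic epoch standing in for one stimulus-viewing window.
rng = np.random.default_rng(0)
eeg = {ch: rng.standard_normal(FS * 10) for ch in LEFT_CHANNELS + RIGHT_CHANNELS}

left = hemisphere_alpha(eeg, LEFT_CHANNELS)
right = hemisphere_alpha(eeg, RIGHT_CHANNELS)
# A right/left ratio above 1 indicates relatively more right-hemisphere alpha.
print(f"right/left alpha ratio: {right / left:.2f}")
```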

Introduction

Emotion is a subjective experience characterized by psychophysiological expressions, biological reactions, and mental states. Ekman (1999), as part of his valuable legacy, established the foundations for measuring emotions. From a psychological point of view, an emotion is a complex psychological state that involves three distinct components: a subjective experience, a physiological response, and a behavioral or expressive response (Hockenbury and Hockenbury, 2007). During his research, Ekman discovered that basic emotions exist and are culturally independent, and that two different facets, emotional and cognitive, can be determined in them (Davis et al., 1987).

