Abstract

Human facial expressions are regarded as a vital indicator of one’s emotion and intention, and can even reveal the state of health and wellbeing. Emotional states have been associated with information processing within and between subcortical and cortical areas of the brain, including the amygdala and prefrontal cortex. In this study, we evaluated the relationship between spontaneous human facial affective expressions and multi-modal brain activity measured via non-invasive and wearable sensors: functional near-infrared spectroscopy (fNIRS) and electroencephalography (EEG). The affective states of twelve male participants detected via fNIRS, EEG, and spontaneous facial expressions were investigated in response to both image-content and video-content stimuli. We propose a method to jointly evaluate fNIRS and EEG signals for affective state detection (emotional valence as positive or negative). Experimental results reveal a strong correlation between spontaneous facial affective expressions and the perceived emotional valence. Moreover, the affective states were estimated from the fNIRS, EEG, and fNIRS + EEG brain activity measurements, and we show that the proposed fNIRS + EEG hybrid method outperforms fNIRS-only and EEG-only approaches. Our findings indicate that dynamic (video-content based) stimuli trigger a larger affective response than static (image-content based) stimuli. These findings also suggest that jointly utilizing facial expressions and wearable neuroimaging (fNIRS and EEG) can improve emotional analysis and affective brain–computer interface applications.
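The abstract does not specify how the fNIRS and EEG signals are combined; a common approach for this kind of hybrid valence detection is feature-level fusion, where per-trial features from each modality are concatenated and fed to a binary classifier. The following is a minimal illustrative sketch of that idea only, not the authors' method: the feature names, dimensions, and the synthetic data are all assumptions, and a simple logistic regression stands in for whatever classifier the study actually used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-trial features (names and dimensions are assumptions):
# EEG features might be band powers; fNIRS features might be mean HbO/HbR changes.
n_trials = 200
eeg_feats = rng.normal(size=(n_trials, 8))    # e.g. per-channel band powers
fnirs_feats = rng.normal(size=(n_trials, 4))  # e.g. per-channel hemoglobin means

# Feature-level fusion: concatenate the two modalities per trial.
x = np.hstack([eeg_feats, fnirs_feats])

# Synthetic valence labels (1 = positive, 0 = negative) correlated with features.
w_true = rng.normal(size=x.shape[1])
y = (x @ w_true + 0.5 * rng.normal(size=n_trials) > 0).astype(float)

def train_logreg(x, y, lr=0.1, steps=500):
    """Plain logistic regression via batch gradient descent."""
    w = np.zeros(x.shape[1])
    b = 0.0
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-(x @ w + b)))  # predicted P(positive valence)
        w -= lr * (x.T @ (p - y)) / len(y)
        b -= lr * np.mean(p - y)
    return w, b

w, b = train_logreg(x, y)
pred = (x @ w + b > 0).astype(float)
accuracy = np.mean(pred == y)
print(f"hybrid EEG+fNIRS training accuracy: {accuracy:.2f}")
```

The intuition behind the fusion is that EEG and fNIRS capture complementary signals (fast electrical activity versus slower hemodynamics), so a classifier over the concatenated features can exploit information that either modality alone would miss.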

Highlights

  • The face has long been considered as a window with a view to our emotions [1]

  • To assess the spontaneous affective states estimated through brain activity, we first evaluate the reliability of the automatic facial emotion recognition system, which is 95%

  • We have demonstrated that affective states can be estimated from human spontaneous facial expressions and brain activity via wearable sensors

Introduction

Facial expressions are regarded as one of the most natural and efficient cues enabling people to interact and communicate with others in a nonverbal manner [2]. With the systematic analysis of facial expression [3], the link between facial expression and emotion has been demonstrated empirically in the psychology literature [1,4]. Decades of behavioral research have revealed that facial expressions carry information about a wide range of phenomena, from psychopathology to consumer preferences [5,6,7]. Recent advances in electronics and computational technologies allow facial expressions to be recorded at increasingly high resolutions and have improved analysis performance. A better understanding of facial expressions can contribute to human-computer interaction and emerging practical applications.