Abstract

Background

Recognition of facial expressions (FEs) plays a crucial role in social interactions. Most studies on FE recognition use static (image) stimuli, even though real-life FEs are dynamic. FE processing is complex and multifaceted, and its neural correlates remain unclear. Transitioning from static to dynamic FE stimuli might help disentangle the neural oscillatory mechanisms underlying face processing and recognition of emotion expression. To our knowledge, we here present the first time–frequency exploration of oscillatory brain mechanisms underlying the processing of dynamic FEs.

Results

Videos of joyful, fearful, and neutral dynamic facial expressions were presented to 18 healthy young adults. We analyzed event-related activity in electroencephalography (EEG) data, focusing on delta, theta, and alpha-band oscillations. Since the videos involved a transition from neutral to emotional expressions (onset around 500 ms), we identified time windows that might correspond initially to face perception (first time window; TW) and subsequently to recognition of the emotion expression (around 1000 ms; second TW). The first TW showed increased power and phase-locking values for all frequency bands. In the second TW, power and phase-locking values were higher in the delta and theta bands for emotional FEs than for neutral FEs, thus potentially serving as a marker for emotion recognition in dynamic face processing.

Conclusions

Our time–frequency exploration revealed consistent oscillatory responses to complex, dynamic, ecologically meaningful FE stimuli. We conclude that while dynamic FE processing involves complex network dynamics, dynamic FEs were successfully used to reveal temporally separate oscillatory responses related first to face processing and subsequently to recognition of the emotion expression.
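The two measures compared across time windows above, event-related spectral power and inter-trial phase-locking values (PLV), can be illustrated with a minimal sketch. The paper's actual pipeline, filter design, sampling rate, and epoch structure are not specified here, so everything below (a Hilbert-transform approach on simulated theta-band trials) is an illustrative assumption, not the authors' method.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

rng = np.random.default_rng(0)
fs = 250                       # sampling rate in Hz (hypothetical)
n_trials, n_samples = 40, fs * 2   # forty 2-second epochs
t = np.arange(n_samples) / fs

# Simulated EEG trials: background noise plus a 6 Hz (theta) burst
# between 0.4 and 0.9 s whose phase is only slightly jittered across
# trials, mimicking an event-related, phase-locked oscillation.
trials = rng.standard_normal((n_trials, n_samples))
burst = (t > 0.4) & (t < 0.9)
for k in range(n_trials):
    jitter = rng.normal(0.0, 0.3)  # small per-trial phase offset (rad)
    trials[k, burst] += 2.0 * np.sin(2 * np.pi * 6 * t[burst] + jitter)

def band_power_and_plv(trials, fs, lo, hi):
    """Band-limited power and inter-trial phase-locking value over time.

    Band-pass filter each trial, take the analytic signal via the
    Hilbert transform, then average squared amplitude (power) and
    unit phase vectors (PLV, 0 = random phase, 1 = perfect locking)
    across trials.
    """
    b, a = butter(4, [lo, hi], btype="band", fs=fs)
    analytic = hilbert(filtfilt(b, a, trials, axis=1), axis=1)
    power = np.mean(np.abs(analytic) ** 2, axis=0)
    plv = np.abs(np.mean(np.exp(1j * np.angle(analytic)), axis=0))
    return power, plv

power, plv = band_power_and_plv(trials, fs, 4, 7)  # theta band (4-7 Hz)
print(f"theta PLV inside burst:  {plv[burst].mean():.2f}")
print(f"theta PLV outside burst: {plv[~burst].mean():.2f}")
```

The PLV inside the burst window approaches 1 because the burst phase barely varies across trials, while outside it the noise-driven PLV stays near the chance level of roughly 1/sqrt(n_trials); comparing such windowed averages between conditions is the general shape of the contrast reported in the abstract.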

Highlights

  • Recognition of facial expressions (FEs) plays a crucial role in social interactions

  • These results confirm that dynamic FE stimuli from different categories affected participants differently and were recognized as conveying different emotions

  • The cardinal findings of the present study were as follows: (I) an early time window had higher power and phase-locking values than a later time window for all frequency bands; (II) delta and theta power were higher in response to emotional FEs than to neutral FEs in the second TW; (III) delta and theta phase-locking values were higher in response to fearful FEs than to joyful and neutral FEs in the second TW; and (IV) right parietal locations had higher power and phase-locking values in the first TW, especially for the theta and alpha frequencies, while there was no such differentiation in the second TW

Introduction

Recognition of facial expressions (FE) is central to human social interactions. Facial expressions of basic emotions can be recognized irrespective of culture or geographical location [1, 2, 3], underlining the evolutionary value of recognizing others' emotions and inferring their intentions. This ability to recognize emotions in FEs can be impaired. Event-related delta and theta oscillations are stronger in response to emotional than neutral FEs [18, 20, 21, 23, 24] and seem to be involved in both non-conscious and conscious aspects of FE processing [20]. Güntekin and Başar [26] showed increased temporoparietal alpha oscillations in response to angry FEs.

