Abstract

Human facial expressions can be recognized rapidly and effortlessly. For the intense emotions encountered in real life, however, positive and negative facial expressions are difficult to discriminate, and judgments of facial expressions are biased toward simultaneously perceived body expressions. This study used event-related potentials (ERPs) to investigate the neural dynamics underlying the integration of emotional signals from facial and body expressions of victory and defeat. Emotional expressions of professional players were used to create pictures of face-body compounds with either matched or mismatched emotional expressions in face and body. Behavioral results showed that congruent emotional information from face and body facilitated the recognition of facial expressions. ERP data revealed larger P1 amplitudes for incongruent than for congruent stimuli. A main effect of body valence was also observed on the P1, with enhanced amplitudes for stimuli containing losing compared to winning bodies. Main effects of body expression were likewise found on the N170 and N2, with winning bodies eliciting larger N170/N2 amplitudes. At a later stage, a significant interaction of congruence by body valence emerged on the P3 component: winning bodies elicited larger P3 amplitudes than losing bodies when face and body conveyed congruent emotional signals. Going beyond knowledge based on prototypical facial and body expressions, these results help us understand the complexity of emotion evaluation and categorization outside the laboratory.

Highlights

  • Both face and body play important roles in conveying emotional information

  • The amygdala responds to fearful faces as early as 40 ms after stimulus onset [2, 3], and the lateral occipital P1 component of the event-related potential (ERP) peaks at approximately 100–120 ms after emotional faces [4,5,6,7,8] and bodies [9] are displayed

  • We focused on the emotional processing of intense face-body expressions and investigated the influence of emotional body language on the processing of emotional faces

Introduction

Both face and body play important roles in conveying emotional information. The amygdala has been found to respond to fearful faces as early as 40 ms after stimulus onset [2, 3], and the lateral occipital P1 component of the event-related potential (ERP) peaks at approximately 100–120 ms after emotional faces [4,5,6,7,8] and bodies [9] are displayed. Studies have demonstrated that the early P1 reflects effects of emotion, usually with enhanced amplitudes in response to fearful compared to happy, sad, and neutral faces [7, 10] and bodies [9].
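
As a concrete illustration of how an early component such as the P1 is typically quantified, the sketch below computes the mean amplitude of the trial-averaged waveform within a fixed post-stimulus latency window. This is not taken from the paper's analysis pipeline: the sampling rate, electrode, trial counts, and data are all assumptions, and the voltages are synthetic placeholders.

    # Illustrative sketch (not the authors' pipeline): quantify the P1 as the
    # mean amplitude of the trial-averaged waveform in a fixed latency window.
    # All numbers and data below are assumptions / synthetic placeholders.
    import numpy as np

    srate = 500                                   # assumed sampling rate (Hz)
    times = np.arange(-0.2, 0.6, 1.0 / srate)     # epoch from -200 to 600 ms

    # Synthetic single-trial voltages (in microvolts) at one occipital
    # electrode, 40 trials per condition; values are random placeholders.
    rng = np.random.default_rng(0)
    congruent = rng.normal(0.0, 2.0, size=(40, times.size))
    incongruent = rng.normal(0.5, 2.0, size=(40, times.size))

    def mean_amplitude(trials, tmin, tmax):
        """Average across trials, then across the latency window (seconds)."""
        window = (times >= tmin) & (times <= tmax)
        erp = trials.mean(axis=0)                 # trial-averaged ERP waveform
        return erp[window].mean()                 # mean amplitude in the window

    # The P1 is typically measured around 100-120 ms after stimulus onset.
    print(f"P1 congruent:   {mean_amplitude(congruent, 0.10, 0.12):.2f} uV")
    print(f"P1 incongruent: {mean_amplitude(incongruent, 0.10, 0.12):.2f} uV")

In practice, condition means of this kind are computed per participant and then compared with repeated-measures statistics, which is how amplitude differences such as the congruence and body-valence effects summarized above are tested.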
