Abstract

Emotions can be perceived from both facial and bodily expressions. Our previous study found that facial expressions could be successfully decoded from functional connectivity (FC) patterns. However, the role of FC patterns in the recognition of bodily expressions remains unclear, and no neuroimaging studies have adequately addressed whether emotions perceived from facial and bodily expressions rely upon common or distinct neural networks. To address this question, the present study collected functional magnetic resonance imaging (fMRI) data in a block-design experiment using facial and bodily expression videos as stimuli (three emotions: anger, fear, and joy) and conducted multivariate pattern classification analyses based on the estimated FC patterns. We found that, in addition to facial expressions, bodily expressions could also be successfully decoded from large-scale FC patterns. Emotion classification accuracies were higher for facial expressions than for bodily expressions. Further analysis of contributive FCs showed that emotion-discriminative networks were widely distributed across both hemispheres, containing regions ranging from primary visual areas to higher-level cognitive areas. Moreover, for a given emotion, the discriminative FCs for facial and bodily expressions were distinct. Together, our findings highlight the key role of FC patterns in emotion processing, indicate how large-scale FC patterns reconfigure during the processing of facial and bodily expressions, and suggest a distributed neural representation for emotion recognition. Furthermore, our results suggest that the human brain employs separate network representations for facial and bodily expressions of the same emotion. This study provides new evidence for network representations underlying emotion perception and may further our understanding of the mechanisms of emotion recognition from body language.
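For context, the sketch below illustrates the general logic of FC-based multivariate pattern classification on simulated data: per-condition region-of-interest (ROI) time series are converted into correlation-based FC matrices, the vectorized upper triangle of each matrix serves as a feature vector, and a cross-validated linear classifier decodes the emotion labels. All variable names, ROI counts, and dimensions are illustrative assumptions, not the authors' actual pipeline.

```python
# Minimal sketch of FC-based multivariate pattern classification.
# Simulated data; ROI counts, trial counts, and labels are hypothetical.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_trials, n_rois, n_timepoints = 60, 90, 120

# Per-trial ROI time series (trials x ROIs x timepoints)
# and emotion labels (0 = anger, 1 = fear, 2 = joy).
timeseries = rng.standard_normal((n_trials, n_rois, n_timepoints))
labels = rng.integers(0, 3, size=n_trials)

def fc_features(ts):
    """Vectorize the upper triangle of the ROI-by-ROI correlation matrix."""
    fc = np.corrcoef(ts)                # (n_rois, n_rois) Pearson correlations
    iu = np.triu_indices_from(fc, k=1)  # keep each FC edge once, drop diagonal
    return fc[iu]

X = np.array([fc_features(ts) for ts in timeseries])  # trials x FC edges
scores = cross_val_score(SVC(kernel="linear"), X, labels, cv=5)
print("Cross-validated decoding accuracy:", scores.mean())
```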

Highlights

  • Humans can readily recognize others’ emotions and respond accordingly

  • Using functional connectivity (FC)-based multivariate pattern analysis (MVPA), we showed that emotions perceived from facial and bodily expressions can be successfully decoded from large-scale FC patterns

  • Using fcMVPA-based classification analyses, we show that rich emotional information is represented in large-scale FC patterns, which support accurate decoding of facial and bodily expressions


Introduction

Humans can readily recognize others’ emotions and respond accordingly. In daily communication, emotions can be perceived from both facial and bodily expressions. An influential model of face perception, proposed by Haxby et al. (2000) and Gobbini and Haxby (2007), consists of a “core” and an “extended” system. Face-selective areas, especially the occipital face area (OFA), the fusiform face area (FFA), and the posterior superior temporal sulcus (pSTS), which together constitute the core face network, have been regarded as key regions responsible for processing the identity and emotional features of the face (Grill-Spector et al., 2004; Ishai et al., 2005; Lee et al., 2010; Gobbini et al., 2011). Recent studies have proposed that the STS participates in the processing of facial and bodily motion, postures, and emotions (Candidi et al., 2011; Zhu et al., 2013).
