We appraise other people's emotions by combining multiple sources of information, including somatic facial/body reactions and the surrounding context. A wealth of literature has revealed how people take contextual information into account when interpreting facial expressions, but the mechanisms mediating this influence remain insufficiently investigated. Across two experiments, we mapped the neural representations of distinct (but comparably unpleasant) negative states, pain and disgust, as conveyed by naturalistic facial expressions or contextual sentences. Negative expressions led to shared activity in the fusiform gyrus and superior temporal sulcus. In contrast, pain contexts recruited the supramarginal, postcentral, and insular cortex, whereas disgust contexts engaged the temporo-parietal cortex and hippocampus/amygdala. When the two sources of information were paired, expressions were more likely to be classified according to the sentence preceding them. Furthermore, networks specifically involved in processing contexts were re-engaged whenever a face followed that context. Finally, the perigenual medial prefrontal cortex showed increased activity for consistent (vs. inconsistent) face-context pairings, suggesting that it integrates state-specific information from the two sources. Overall, our study reveals the heterogeneous nature of face-context information integration, which operates according to both a state-general and a state-specific principle, with the latter mediated by the perigenual medial prefrontal cortex.

Significance Statement

With the aid of a controlled database and a comprehensive paradigm, our study provides new insights into the brain and behavioral processes mediating contextual influences on emotion-specific face processing. Our results reveal that context operates in both a face-independent and a face-conditional fashion: it biases the interpretation of any face towards the state implied by the associated context, and it also triggers processes that monitor the consistency between the different sources of information. Overall, our study unveils key neural processes underlying the coding of state-specific information from both face and context and sheds new light on how they are integrated within the medial prefrontal cortex.