Abstract

Social interactions arise from patterns of communicative signs, whose perception and interpretation require a multitude of cognitive functions. The semiotic framework of Peirce’s Universal Categories (UCs) laid the groundwork for a novel cognitive-semiotic typology of social interactions. During functional magnetic resonance imaging (fMRI), 16 volunteers watched a movie narrative encompassing verbal and non-verbal social interactions. Three types of non-verbal interactions were coded (“unresolved,” “non-habitual,” and “habitual”) based on a typology reflecting Peirce’s UCs. As expected, the auditory cortex responded to verbal interactions, but non-verbal interactions modulated temporal areas as well. Conceivably, when speech was lacking, ambiguous visual information (unresolved interactions) primed auditory processing, in contrast to learned behavioral patterns (habitual interactions). The latter recruited a parahippocampal-occipital network supporting conceptual processing and associative memory retrieval. Non-habitual interactions, which call for semiotic contextualization, activated visuo-spatial and contextual rule-learning areas such as the temporo-parietal junction and right lateral prefrontal cortex. In summary, the cognitive-semiotic typology reflected distinct sensory and association networks underlying the interpretation of observed non-verbal social interactions.

Highlights

  • During social interactions, a multitude of auditory and visual cues interact to convey meaning

  • The most consistently reported neural correlates of social-cognitive functions encompass superior temporal gyrus (STG), temporo-parietal junction (TPJ), medial prefrontal cortex (PFC), fusiform gyrus, and precuneus (Iacoboni et al., 2004; Wolf et al., 2010; Lahnakoski et al., 2012; Wagner et al., 2016)

  • The activation differences were characterized by a gradient from unresolved to non-habitual to habitual interactions: in post hoc t-tests, unresolved interactions yielded higher activity compared to habitual interactions (t(15) = 5.28, p < 0.001) and to non-habitual interactions (t(15) = 2.60, p = 0.02; habitual versus non-habitual: t(15) = 0.98, p = 0.344, n.s.)


Introduction

A multitude of auditory and visual cues interact to convey meaning. Interpreting social interactions requires the interplay of various cognitive functions; among these are social attention mechanisms, mentalizing, language comprehension, and the recognition of faces, communicative gestures, goal-directed movements, and emotions. The posterior superior temporal gyrus (STG) and temporo-parietal junction (TPJ) of the right hemisphere serve as key regions during the processing of real-life social interactions and joint attention (Redcay et al., 2010; for a meta-analysis see Krall et al., 2015). Both regions may contribute to the analysis of social relations in movie clips (Iacoboni et al., 2004). The significance of several cortical and subcortical structures for the processing of social interaction has become evident, but their specific contribution to the interpretation of interaction events, especially during naturalistic stimulation, remains to be determined.

