Abstract
The investigation of how humans perceive and respond to emotional signals conveyed by the human body has long been secondary to research on the recognition of facial expressions and emotional scenes. The aims of this behavioral study were to assess the ability to process emotional body postures and to test whether the motor response is driven mainly by the emotional content of the picture or is influenced by motor resonance. Emotional body postures and scenes (IAPS) divided into three clusters (fear, happiness, and neutral) were shown to 25 healthy subjects (13 males, mean age ± SD: 22.3 ± 1.8 years) in a three-alternative forced-choice task. Subjects were asked to recognize the emotional content of the pictures by pressing one of three keys as fast as possible, allowing response times (RTs) to be estimated. Subjects also rated the valence and arousal of each picture. We found shorter RTs for fearful body postures than for happy and neutral postures. In contrast, no differences across emotional categories were found for the IAPS stimuli. Analyses of the valence and arousal ratings and the subsequent item analysis showed excellent reliability for the two sets of images used in the experiment. Our results show that fearful body postures are rapidly recognized and processed, probably owing to the automatic activation of central nervous system structures that orchestrate defensive reactions to threat, strengthening and supporting previous neurophysiological and behavioral findings on body language processing.
Highlights
The investigation of how humans perceive and respond to emotional signals conveyed by body expressions has long been secondary to research addressing the recognition of emotional faces or emotional scenes (de Gelder, 2009; de Gelder et al., 2010).
The repeated-measures analysis of variance (rmANOVA) showed a significant main effect of PICTURE (F(1,24) = 7.907; p = 0.01; ηp² = 0.248), with higher accuracy for International Affective Picture System (IAPS) pictures than for body postures (94.8 ± 1.4% and 92.6 ± 1.6%, respectively), but no main effect of CATEGORY and no PICTURE × CATEGORY interaction (F < 1, p > 0.05).
The rmANOVA showed a significant main effect of EMOTION (F(2,48) = 14.282; p < 0.01; ηp² = 0.373), indicating that accuracy was lower for fear (85.9 ± 3.0%) than for happiness (91.3 ± 2.5%; p = 0.05) and neutral stimuli (96.2 ± 1.6%; p < 0.01), and lower for happiness than for neutral stimuli (p = 0.05).
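For readers who want to reproduce this kind of analysis, the short Python sketch below shows how a two-way repeated-measures ANOVA on accuracy (within-subject factors PICTURE and CATEGORY) could be run. The file name, column names, and the use of the pandas and pingouin libraries are illustrative assumptions for this sketch, not the authors' actual analysis pipeline.

# Minimal sketch, assuming a long-format table with one accuracy value
# per subject for each PICTURE (posture vs. IAPS) x CATEGORY
# (fear, happiness, neutral) cell. File and column names are hypothetical.
import pandas as pd
import pingouin as pg

df = pd.read_csv("accuracy_long.csv")  # columns: subject, picture, category, accuracy

# Two-way repeated-measures ANOVA: returns F and p values (plus an
# effect size) for each main effect and the interaction.
aov = pg.rm_anova(
    data=df,
    dv="accuracy",
    within=["picture", "category"],
    subject="subject",
    detailed=True,
)
print(aov.round(3))

A one-way rmANOVA on the EMOTION factor, as reported in the last highlight, would use the same call with a single within factor; pairwise follow-up comparisons could then be run with pingouin's pairwise-test functions.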
Summary
The investigation of how humans perceive and respond to emotional signals conveyed by body expressions has long been secondary to research addressing the recognition of emotional faces or emotional scenes (de Gelder, 2009; de Gelder et al., 2010). Processing facial and bodily emotional expressions spontaneously induces motor mimicry in the observer (Huis In ’t Veld et al., 2014; Ross and Atkinson, 2020), a mechanism that can contribute to accurate emotion recognition (Oberman et al., 2007; Wood et al., 2016; Borgomaneri et al., 2020a). These studies suggest that perceiving others’ emotional expressions involves a simulation of the motor plans and associated sensory representations engaged when making the same expressions (Adolphs et al., 2000; Niedenthal et al., 2010; Huis In ’t Veld et al., 2014; Paracampo et al., 2017; Ross and Atkinson, 2020), reflecting a simulation of the whole-body state associated with the emotion (Ross and Atkinson, 2020). Such motor activations may reflect sensorimotor simulation and/or the activation of motivational tendencies that facilitate emotionally congruent behavior, with positive stimuli activating approach tendencies and negative stimuli activating avoidance tendencies (Lang et al., 1990; Ekman and Davidson, 1995; Lang and Bradley, 2010).