Abstract

Successful social communication draws strongly on the correct interpretation of others' body and vocal expressions. Both can provide emotional information and often occur simultaneously, yet their interplay has hardly been studied. Using electroencephalography, we investigated the temporal dynamics underlying their neural interaction in auditory and visual perception. In particular, we tested whether this interaction qualifies as true integration following multisensory integration principles such as inverse effectiveness. Emotional vocalizations were embedded in either low or high levels of noise and presented with or without video clips of matching emotional body expressions. In both high and low noise conditions, a reduction in auditory N100 amplitude was observed for audiovisual stimuli. However, only under high noise did the N100 peak earlier in the audiovisual than in the auditory condition, suggesting facilitatory effects as predicted by the inverse effectiveness principle. Similarly, we observed earlier N100 peaks in response to emotional compared to neutral audiovisual stimuli; this was not the case in the unimodal auditory condition. Furthermore, suppression of beta-band oscillations (15–25 Hz), primarily reflecting biological motion perception, was modulated 200–400 ms after the vocalization. For emotional stimuli, the difference in suppression between audiovisual and auditory stimuli was larger under high than under low noise, whereas no such difference was observed for neutral stimuli. This observation is in accordance with the inverse effectiveness principle and suggests a modulation of integration by emotional content. Overall, the results show that ecologically valid, complex stimuli such as combined body and vocal expressions are effectively integrated very early in processing.
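
The two EEG measures behind these findings, the N100 peak (amplitude and latency) and beta-band (15–25 Hz) suppression, can be computed from epoched data in a few steps. The following Python sketch is purely illustrative and not the authors' analysis pipeline; the sampling rate, time windows, single-channel selection, and placeholder data are all assumptions.

    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    FS = 500                                  # sampling rate in Hz (assumed)
    TIMES = np.arange(-0.2, 0.8, 1 / FS)      # epoch time axis; stimulus onset at 0 s

    def n100_peak(epochs):
        """N100 amplitude and latency from (n_trials, n_samples) epochs of a
        single fronto-central channel; the N100 is the minimum at ~80-150 ms."""
        erp = epochs.mean(axis=0)                    # trial-averaged ERP
        win = (TIMES >= 0.08) & (TIMES <= 0.15)      # search window (assumed)
        idx = np.argmin(erp[win])                    # negative-going peak
        return erp[win][idx], TIMES[win][idx]

    def beta_suppression(epochs, baseline=(-0.2, 0.0), window=(0.2, 0.4)):
        """Beta-band (15-25 Hz) power change in dB relative to baseline,
        estimated via band-pass filtering and the Hilbert envelope."""
        b, a = butter(4, [15, 25], btype="band", fs=FS)
        power = np.abs(hilbert(filtfilt(b, a, epochs, axis=1))) ** 2
        mean_power = power.mean(axis=0)              # average over trials
        base = mean_power[(TIMES >= baseline[0]) & (TIMES < baseline[1])].mean()
        post = mean_power[(TIMES >= window[0]) & (TIMES < window[1])].mean()
        return 10 * np.log10(post / base)            # negative values = suppression

    # Hypothetical usage with placeholder data in place of real recordings:
    rng = np.random.default_rng(0)
    for name in ("auditory", "audiovisual"):
        epochs = rng.standard_normal((60, TIMES.size))   # 60 trials (placeholder)
        amp, lat = n100_peak(epochs)
        print(f"{name}: N100 {amp:.2f} a.u. at {lat * 1000:.0f} ms, "
              f"beta change {beta_suppression(epochs):.2f} dB")

Comparing conditions then amounts to contrasting these peak latencies, amplitudes, and suppression values across participants.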

Highlights

  • Body expressions and vocalizations play an important role in communication

  • To assess the impact of visual on auditory processing, we focused on the auditory N100, an event-related potential (ERP) component that robustly reflects this impact [30], with facilitated processing indicated by shorter peak latencies [31] and reduced peak amplitudes [30,31]

  • An interaction with the factor emotion (F(2.77, 58.20) = 16.32, p < .0001, η² = .19) shows that at high as well as low noise levels in the auditory condition, participants performed worst at distinguishing anger from fear (high noise: t(21) = 4.96, p < .0001, r = .73; low noise: t(21) = 3.38, p < .01, r = .59, both compared to the next-worst distinction); the computation of these statistics is illustrated in the sketch after this list
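
The effect sizes reported here can be derived from the paired t statistic as r = sqrt(t² / (t² + df)). Below is a minimal sketch of this computation in Python; the accuracy data and comparison are placeholders, not values from the study, and only the sample size (n = 22, giving df = 21) matches the degrees of freedom reported above.

    import numpy as np
    from scipy.stats import ttest_rel

    rng = np.random.default_rng(1)
    n = 22                                        # participants; df = n - 1 = 21
    acc_anger_fear = rng.uniform(0.4, 0.7, n)     # placeholder accuracies
    acc_next_worst = acc_anger_fear + rng.uniform(0.0, 0.2, n)

    t, p = ttest_rel(acc_next_worst, acc_anger_fear)   # paired t-test
    df = n - 1
    r = np.sqrt(t**2 / (t**2 + df))               # effect size r from t and df
    print(f"t({df}) = {t:.2f}, p = {p:.4g}, r = {r:.2f}")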

Introduction

Body expressions and vocalizations play an important role in communication. We can readily determine from either modality someone’s gender [1], emotion [2,3], or how familiar a person is [4,5]. While body expressions are by definition biological motion, vocalizations are generated by the vocal tract and strongly influenced by body posture, making them a product of biological motion as well. Both provide closely time-locked and congruent information. A common approach, especially in the investigation of audiovisual emotion perception, is the use of mismatch paradigms, in which violation responses can be observed when the two modalities provide conflicting information [11,12]. While these studies suggest an integration of facial and vocal information, they can only infer integration indirectly from responses to incongruency. In an ecologically valid context, however, congruency between modalities is far more common than incongruency.
