Abstract

The ability to identify emotions from the human voice is a crucial aspect of social cognition. Currently, very little is known about the neural correlates of processing nonverbal emotional vocalizations. We used electrophysiological measures to examine the processing of emotional versus neutral vocalizations. Participants listened to nonverbal angry, happy, and neutral vocalizations, as well as to monkey voices, which served as a response target. Angry sounds were processed differently from happy and neutral ones starting at 50 ms, whereas both vocal emotions were associated with decreased N100 and increased P200 components relative to neutral sounds. These findings indicate a rapid and automatic differentiation of emotional as compared with neutral vocalizations and suggest that this differentiation is not dependent on valence.
