Abstract

Nonverbal expressions of emotion can vary in intensity, from ambiguous to prototypical exemplars: for instance, facial displays of happiness may range from a faint smile to a full-blown grin. Previous work suggests that the accuracy with which facial expressions are recognized as the intended emotion increases with emotional intensity, although this pattern depends on the displayed emotion. Less is known about the association between emotional intensity and the recognition of vocal emotional expressions (affective prosody), which also convey information about others' socioemotional intent but are perceived and interpreted differently than facial expressions. The current study examined listeners' ability to recognize emotional intent in morphed vocal prosody recordings that varied in emotional intensity from neutral to prototypical exemplars of basic emotions (anger, disgust, fear, happiness, sadness) and social expressions (friendliness, meanness). Results suggest that listeners' accuracy in identifying the intended emotion in each recording increased nonlinearly with emotional intensity. This pattern varied by emotion type: for instance, accuracy for anger rose steeply with increasing emotional intensity before plateauing, whereas accuracy for happiness remained unchanged across low-intensity exemplars but increased thereafter. These findings highlight emotion-specific ways in which dynamic changes in emotional intensity inform perceptions of socioemotional intent in emotional prosody. Moreover, these results point to potential challenges for emotional communication in social interactions that rely primarily on the voice, as many low-intensity expressions have a higher probability of being misinterpreted. (PsycInfo Database Record (c) 2021 APA, all rights reserved).
