Affective voice signaling has significant biological and social relevance across various species, and different affective signaling types have emerged through the evolution of voice communication. These types range from basic affective voice bursts and nonverbal affect signals up to affective intonations superimposed on speech utterances in humans in the form of paraverbal prosodic patterns. These different types of affective signaling should have evolved to be acoustically and perceptually distinctive, allowing accurate and nuanced affective communication. It might be assumed that affect signaling is most effective and distinctive in affective prosody, presumably the most recently evolved form of acoustic voice signaling. We investigated and compared two signaling types in human voice communication with different evolutionary backgrounds, referred to as nonverbal affect signals (shared across many species) and affective prosody (exclusive to humans). We found, first, that various basic affect categories seem to be distinctively encoded in both signal types, but there seems to be minimal continuity in the acoustic code from nonverbal affect signals to affective prosody and vice versa. Second, we found that decoding affective meaning from affective prosody seems considerably impaired. Many positive affect signals, and especially vocal disgust, showed extreme decoding impairments in affective prosody, with speech acoustics probably constraining affect encoding in prosody to a considerable degree. Only the recognizability of voice signals of threat seems to be largely preserved in affective prosody. In conclusion, these findings point to considerable discontinuities between nonverbal and paraverbal affect signals, which calls into question the evolutionary precursors of human affect signaling in voice communication. (PsycInfo Database Record (c) 2025 APA, all rights reserved).