Abstract

Theories of vocal signalling in humans typically consider only communication within the interactive group and ignore intergroup dynamics. Recent work has found that colaughter generated between pairs of people in conversation can afford accurate judgements of affiliation across widely disparate cultures, and that the acoustic features listeners use to make these judgements are linked to speaker arousal. But to what extent does colaughter inform third-party listeners beyond other dynamic information between interlocutors, such as overlapping talk? We presented listeners with short segments (1–3 s) of colaughter and simultaneous speech (i.e. cospeech) taken from natural conversations between established friends and newly acquainted strangers. Participants judged whether the pairs of interactants in the segments were friends or strangers. Colaughter afforded more accurate judgements of affiliation than did cospeech, despite cospeech segments being, on average, more than twice the duration of the colaughter segments. Sped-up versions of colaughter and cospeech (proxies of speaker arousal) did not improve accuracy in identifying either friends or strangers, but faster versions of both modes increased the likelihood that tokens were judged as being between friends. Overall, the results are consistent with research showing that laughter is well suited to transmit rich information about social relationships to third-party overhearers: a signal that works between, and not just within, conversational groups.

Highlights

  • During social interactions, people produce a variety of dynamic behaviours that serve functions within an interacting group [1] but can also inform third parties about the nature of their social relationships and, more broadly, their intentions

  • Participants were able to judge both colaughter and cospeech above chance (AUC greater than 0.5; table 1 and figure 4), but were credibly better at judging colaughter: the difference in sensitivity between conditions was 0.37, 95% credible interval (CI) [0.24, 0.51], with 100% of the estimated parameter values indicating higher sensitivity for colaughter (an illustrative sketch of this kind of analysis follows the highlights)

  • Participants were more likely to judge colaughter (62%) than cospeech (47.5%) as produced by friends, a pattern that held even when only stranger-produced stimuli were considered, with a false-positive rate of 49% for colaughter versus 41% for cospeech
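
To make the sensitivity and credible-interval figures above concrete, here is a minimal, hypothetical sketch, assuming invented response counts and simple Beta(1, 1) priors; it is not the paper's statistical model or data. It estimates signal-detection sensitivity (d′) for friend/stranger judgements in each condition, a 95% credible interval for the colaughter-minus-cospeech difference, and the AUC implied by an equal-variance Gaussian model.

```python
# Hypothetical sketch only: the counts below are invented for illustration
# and are NOT the study's data. "Hit" = a friend stimulus judged as friends;
# "false alarm" = a stranger stimulus judged as friends.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
N_DRAWS = 20_000

counts = {
    "colaughter": dict(hits=310, misses=190, fas=245, crs=255),
    "cospeech":   dict(hits=240, misses=260, fas=205, crs=295),
}

def posterior_dprime(c, n_draws=N_DRAWS):
    """Posterior draws of d' = z(hit rate) - z(false-alarm rate),
    assuming Beta(1, 1) priors on both rates (conjugate update)."""
    hit_rate = rng.beta(c["hits"] + 1, c["misses"] + 1, size=n_draws)
    fa_rate = rng.beta(c["fas"] + 1, c["crs"] + 1, size=n_draws)
    return norm.ppf(hit_rate) - norm.ppf(fa_rate)

d_laugh = posterior_dprime(counts["colaughter"])
d_speech = posterior_dprime(counts["cospeech"])
diff = d_laugh - d_speech

lo, hi = np.percentile(diff, [2.5, 97.5])
print(f"posterior mean difference in d': {diff.mean():.2f}")
print(f"95% credible interval: [{lo:.2f}, {hi:.2f}]")
print(f"share of draws favouring colaughter: {(diff > 0).mean():.1%}")

# Under an equal-variance Gaussian model, d' maps onto the area under the
# ROC curve as AUC = Phi(d' / sqrt(2)); AUC > 0.5 means above-chance accuracy.
print(f"implied AUC, colaughter: {norm.cdf(d_laugh.mean() / np.sqrt(2)):.2f}")
print(f"implied AUC, cospeech:   {norm.cdf(d_speech.mean() / np.sqrt(2)):.2f}")
```

The "share of draws favouring colaughter" line is the analogue of the statement in the highlights that 100% of the estimated parameter values indicated higher sensitivity for colaughter.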

Introduction

People produce a variety of dynamic behaviours that serve functions within an interacting group [1] but can also inform third parties about the nature of their social relationships and, more broadly, their intentions. Here, we used cospeech as an appropriate baseline behaviour in conversational interaction rather than as a viable alternative signal of affiliative status between speakers. Emotional signals such as colaughter contain rich information about the mutual affective intentions between socially interacting individuals, including honest signals of the interactants' physiological states. Colaughter between friends is more likely to be judged as containing individual spontaneous laughter [9,18], a product of the evolutionarily conserved vocal emotion system [55,56,57]. These signals can reveal social information in a way that ordinary speech typically does not. While arousal can have perceptible effects on speech rate [58], there are many non-affective reasons that people alter their speech rate, and intra- and interspeaker variability are high.
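
As a brief illustration of the kind of rate manipulation described in the abstract, the following is a minimal sketch, assuming the librosa and soundfile libraries, a hypothetical input file clip.wav, and a hypothetical 1.25x rate. The paper's exact procedure (including whether pitch was preserved when stimuli were sped up) is not specified here, so both a pitch-preserving and a naive resampling variant are shown.

```python
# Illustrative sketch: generating faster versions of an audio clip.
# "clip.wav" and the 1.25x rate are hypothetical, not taken from the study.
import librosa
import soundfile as sf

y, sr = librosa.load("clip.wav", sr=None)   # keep the file's native sample rate

# Option 1: time-stretch (shortens duration, preserves pitch).
y_fast = librosa.effects.time_stretch(y, rate=1.25)
sf.write("clip_fast_pitch_preserved.wav", y_fast, sr)

# Option 2: naive sped-up playback (shortens duration AND raises pitch),
# done here by writing the same samples with a higher nominal sample rate.
sf.write("clip_fast_resampled.wav", y, int(sr * 1.25))
```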

Method
Participants
Laughter and speech stimuli
Procedure
Statistical modelling
Results
Discussion
