Abstract

Speech acoustics and body movements are systematically linked during speaking; however, there is no consensus on which acoustic and/or kinematic signals are relevant for this coordination, nor on what critical landmarks facilitate this linkage. To address this, we use correlation map analysis (CMA) of speech acoustics and body movement to test the hypothesis that speech and gesture are coordinated at prosodically prominent speech regions. CMA allows for the analysis of correlation between any two continuous signals, both instantaneously and at user-defined ranges of delay. In this study, we examine the relationship between the RMS amplitude of the speech acoustic signal and the velocity signal of the dominant hand at speech turns, phrase boundaries, acoustic amplitude peaks, and other (elsewhere) regions. We find that turns and phrase edges exhibit the greatest likelihood of positive correlation. Additionally, the likelihood of correlation is higher when the hand velocity signal is delayed with respect to the amplitude signal. These results suggest that speech and gesture may be strongly linked at speech-turn landmarks for the purpose of signaling a floor exchange. Thus, in addition to coordination for semantic and prosodic purposes, we propose that speech-gesture coordination can serve, together with prosody, to signal speech turns in conversation. [Work supported by NIH.]
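The core method described above, correlation map analysis, evaluates the correlation between two continuous signals in sliding windows and across a range of delays of one signal relative to the other. The following is a minimal illustrative sketch of that general idea, not the authors' implementation; the function name, window length, and lag range are hypothetical, and Pearson correlation in rectangular windows is assumed:

```python
import numpy as np

def windowed_lag_correlation(x, y, win=50, lags=range(-20, 21)):
    """Sketch of a CMA-style map: Pearson correlation between two
    equal-length signals x and y (e.g., RMS speech amplitude and
    hand velocity), computed in sliding windows of length `win` for
    each delay in `lags` (positive lag = y shifted later in time).
    Returns an (n_lags, n_windows) array of correlation values."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    n = len(x)
    lags = list(lags)
    cmap = np.full((len(lags), n - win + 1), np.nan)
    for i, lag in enumerate(lags):
        # Shift y by `lag` samples; np.roll wraps around, so in
        # practice windows touching the wrapped edges would be discarded.
        y_shift = np.roll(y, lag)
        for start in range(n - win + 1):
            xs = x[start:start + win]
            ys = y_shift[start:start + win]
            if xs.std() > 0 and ys.std() > 0:
                cmap[i, start] = np.corrcoef(xs, ys)[0, 1]
    return cmap
```

For example, averaging the map over windows and locating the lag with the highest mean correlation gives a crude estimate of the overall delay between the two signals, analogous to the amplitude-to-velocity delay reported in the abstract.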
