Abstract

It is commonly understood that hand gesture and speech coordination in humans is culturally and cognitively acquired, rather than having a biological basis. Recently, however, the biomechanical physical coupling of arm movements to speech vocalization has been studied in steady‐state vocalization and monosyllabic utterances, where forces produced during gesturing are transferred onto the tensioned body, leading to changes in respiratory‐related activity and thereby affecting vocalization F0 and intensity. In the current experiment (n = 37), we extend this previous line of work to show that gesture–speech physics also impacts fluent speech. Compared with a no‐movement condition, participants producing fluent self‐formulated speech while rhythmically moving their limbs showed heightened F0 and amplitude envelope, and these effects were more pronounced for higher‐impulse arm versus lower‐impulse wrist movement. We replicate the finding that acoustic peaks arise especially during moments of peak impulse (i.e., the beat) of the movement, namely around its deceleration phases. Finally, higher deceleration rates of higher‐mass arm movements were related to higher peaks in acoustics. These results confirm a role for the physical impulses of gesture in affecting the speech system. We discuss the implications of gesture–speech physics for understanding the emergence of communicative gesture, both ontogenetically and phylogenetically.

Highlights

  • Communicative hand gestures are ubiquitous across human cultures

  • We report three main analyses showing that gesture–speech physics is present in fluent speech

  • We assess the overall effects of movement condition on vocalization acoustics (F0 and the amplitude envelope), which would support our hypothesis that upper limb movement, especially high-impulse movement, constrains fluent speech acoustics


Introduction

Communicative hand gestures are ubiquitous across human cultures. Gestures aid communication by seamlessly interweaving relevant pragmatic, iconic, and symbolic expressions of the hands together with speech.[1,2,3] For such multiarticulatory utterances to do their communicative work, gesture and speech must be tightly temporally coordinated to form a sensible speech–gesture whole. The salient moments of gestures are often timed with emphatic stress in speech, no matter what the hands depict.[4,5] For such gesture–speech coordination to get off the ground, the system must functionally constrain its degrees of freedom;[6] in doing so, it will have to utilize (or otherwise account for) the intrinsic dynamics arising from the biophysics of speaking and moving at the same time. We provide evidence that movement of the upper limbs constrains fluent self-generated speech acoustics through biomechanics.
