Abstract

Speech and gesture are two integrated and temporally coordinated systems. Manual gestures can help second language (L2) speakers with vocabulary learning and word retrieval. However, it remains under-investigated whether the synchronisation of speech and gesture helps listeners compensate for difficulties in processing L2 aural information. In this paper, we tested, in two behavioural experiments, how L2 speakers process speech–gesture asynchronies in comparison to native (L1) speakers. L2 speakers responded significantly faster when gestures and the semantically relevant speech were synchronous than when they were asynchronous, and they responded significantly more slowly than L1 speakers regardless of speech/gesture synchronisation. L1 speakers, by contrast, showed no significant difference between synchronous and asynchronous integration of gestures and speech. We conclude that gesture–speech asynchrony affects L2 speakers more than L1 speakers.
