Abstract

Recent research in psychology and neuroscience has demonstrated that co-speech gestures are semantically integrated with speech during language comprehension and development. The present study explored whether gestures also play a role in language learning in adults. In Experiment 1, we exposed adults to a brief training session presenting novel Japanese verbs with and without hand gestures. Three sets of memory tests (at five minutes, two days, and one week) showed that the greatest word learning occurred when gestures conveyed imagistic information redundant with speech. Experiment 2 was a preliminary investigation into possible neural correlates of such learning. We exposed participants to similar training sessions over three days and then measured event-related potentials (ERPs) to words learned with and without co-speech gestures. The main finding was that words learned with gesture produced a larger Late Positive Complex (indexing recollection) at bilateral parietal sites than words learned without gesture. However, there was no significant difference between the two conditions for the N400 component (indexing familiarity). The results have implications for pedagogical practices in foreign language instruction and for theories of gesture-speech integration.
