Abstract

It is known that the rhythms of speech are visible on the face, accurately mirroring changes in the vocal tract. These low-frequency visual temporal movements are tightly correlated with speech output, and both visual speech (e.g., mouth motion) and the acoustic speech amplitude envelope entrain neural oscillations. Behavioural studies show that infants perceive low-frequency visual temporal information ('visual prosody'), but oscillatory studies are currently lacking. Here we measure cortical tracking of low-frequency visual temporal information by 5- and 8-month-old infants using a rhythmic speech paradigm (repetition of the syllable 'ta' at 2 Hz). Eye-tracking data were collected simultaneously with EEG, enabling computation of cortical tracking and phase angle during visual-only speech presentation. Significantly higher power at the stimulus frequency indicated that cortical tracking occurred at both ages. Further, individual differences in preferred phase to visual speech related to subsequent measures of language acquisition. The difference in phase between visual-only speech and the same speech presented as auditory-visual speech at 6 and 9 months was also examined. These neural data suggest that individual differences in early language acquisition may be related to the phase of entrainment to visual rhythmic input in infancy.

Research highlights:
- Infant preferred phase to visual rhythmic speech predicts language outcomes.
- Significant cortical tracking of visual speech is present at 5 and 8 months.
- Phase angle to visual speech at 8 months predicted greater receptive and productive vocabulary at 24 months.
