Abstract

Visual speech cues enhance speech perception for adults in challenging listening environments: reliance on visual speech increases with greater auditory degradation. Young children, however, may be limited in audiovisual speech perception because of the cognitive costs associated with multimodal processing. In this study, real-time speech understanding was measured in preschool children (N = 33, 30–48 months of age) using eye-tracking methodology in the absence (quiet) or presence of two-talker babble and in the absence (auditory-alone, AO) or presence of audiovisual (AV) speech cues. On each trial, children were instructed by a female speaker to look at one of two objects projected onto a large screen. Speech processing was quantified by how quickly children fixated the target object (reaction time, RT) and by overall accuracy of target-object fixation. Visual benefit was calculated as the difference in performance between the AO and AV conditions. Analyses revealed negative correlations between RT_AO and RT_AV in both the quiet and two-talker babble conditions: visual speech facilitated speech processing in children with slow RT_AO, but not in children with fast RT_AO. Results suggest that visual speech facilitates speech perception in preschool children, but that this benefit likely depends on children's processing efficiency for the AO speech signal.
