Abstract

Listener-oriented hyperarticulated clear speech facilitates linguistic processing and cognitive functioning associated with speech perception under various listening conditions. Using the visual-world eye-tracking paradigm, we investigated whether clear speech also aids speech segmentation, or the discovery of word boundaries, and examined the dynamic time course of its effect. Native American English speakers (N = 77) heard sentences in which the target word (e.g., ham) was temporarily ambiguous with a longer unintended competitor (e.g., hamster) across a word boundary (e.g., She saw the ham starting…) while viewing images depicting the target, competitor, and unrelated distractors. Clear and conversational sentences were presented in quiet or in speech-shaped noise at a +3 dB signal-to-noise ratio. Analysis of eye fixations to the images over time revealed that, compared with conversational speech, clear speech facilitated the disambiguation of the target from the competitor even before the disambiguation point was reached. The facilitation was found in both listening conditions but was relatively delayed in noise. These findings suggest that speaking clearly improves word segmentation and reduces lexical competition, especially in optimal listening conditions. The speech segmentation facilitation may partly underlie the clear speech benefits observed for other signal-dependent and relatively signal-independent linguistic and cognitive processes.
