Abstract

Spoken language contains few reliable acoustic cues to word boundaries, yet listeners readily perceive words as separated in continuous speech. Dilley and Pitt (2010) showed that the rate of nonlocal (i.e., distal) context speech influences word segmentation, but present theories of word segmentation cannot account for whether and how this cue interacts with other acoustic cues proximal to (i.e., in the vicinity of) the word boundary. Four experiments examined the interaction of distal speech rate with four proximal acoustic cues that have been shown to influence segmentation: intensity (Experiment 1), fundamental frequency (Experiment 2), word duration (Experiment 3), and high-frequency noise resembling a consonantal onset (Experiment 4). Participants listened to sentence fragments and indicated which of two lexical interpretations they heard, where one interpretation contained more words than the other. Across all four experiments, both distal speech rate and proximal acoustic manipulations affected the reported lexical interpretation, but the two types of cues did not consistently interact. Overall, the results are inconsistent with a strictly ranked hierarchy of cues to word boundaries and instead highlight the need for theories of word segmentation and lexical access to allow flexible rankings of cues to word boundary placement.
