Abstract
Speech contains rich acoustic and linguistic information. Using highly controlled speech materials, previous studies have demonstrated that cortical activity is synchronous to the rhythms of perceived linguistic units, such as words and phrases, on top of basic acoustic features, such as the speech envelope. It remains unclear, however, how cortical activity jointly encodes acoustic and linguistic information when listeners hear natural speech. Here we investigate the neural encoding of words using electroencephalography (EEG) and observe neural activity synchronous to multi-syllabic words when participants naturally listen to narratives. An amplitude modulation (AM) cue for word rhythm enhances the word-level response, but the effect is observed only during passive listening. Furthermore, words and the AM cue are encoded by spatially separable neural responses that are differentially modulated by attention. These results suggest that bottom-up acoustic cues and top-down linguistic knowledge separately contribute to the cortical encoding of linguistic units in spoken narratives.
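The core analysis logic behind such findings, frequency tagging, can be illustrated with a short simulation. The sketch below is a minimal illustration and not the paper's actual pipeline: the 100 Hz EEG sampling rate, the 4 Hz syllable rate, the 2 Hz word rate for disyllabic items, and the neighboring-bin SNR heuristic are all hypothetical values chosen for the example.

```python
import numpy as np

fs = 100.0            # EEG sampling rate in Hz (assumed)
syllable_rate = 4.0   # syllables per second (hypothetical, for illustration)
word_rate = 2.0       # disyllabic words -> word rate = syllable rate / 2

t = np.arange(0, 60, 1 / fs)  # 60 s of simulated data

# Simulated EEG: a syllable-rate component (acoustic tracking), a weaker
# word-rate component (linguistic tracking), plus noise.
eeg = (0.5 * np.sin(2 * np.pi * syllable_rate * t)
       + 0.2 * np.sin(2 * np.pi * word_rate * t)
       + np.random.randn(t.size))

# Frequency tagging: look for spectral peaks at the word and syllable rates.
spectrum = np.abs(np.fft.rfft(eeg)) / t.size
freqs = np.fft.rfftfreq(t.size, 1 / fs)

for rate, label in [(word_rate, "word"), (syllable_rate, "syllable")]:
    bin_idx = np.argmin(np.abs(freqs - rate))
    # Compare the target bin against neighboring bins as a noise estimate.
    neighbors = np.r_[spectrum[bin_idx - 5:bin_idx - 1],
                      spectrum[bin_idx + 2:bin_idx + 6]]
    snr = spectrum[bin_idx] / neighbors.mean()
    print(f"{label}-rate ({rate} Hz) SNR: {snr:.2f}")
```

A spectral peak at the word rate that clearly exceeds the neighboring-bin noise floor is the frequency-domain signature of word-synchronous activity; the same test at the syllable rate indexes acoustic tracking.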
Highlights
When listening to speech, low-frequency cortical activity in the delta (<4 Hz) band is synchronous to the rhythms of perceived linguistic units, such as words and phrases, on top of basic acoustic features, such as the speech envelope
When participants naturally listen to spoken narratives, we observe that cortical activity is synchronous to the rhythm of spoken words
The word-synchronous response is observed whether participants listen to natural speech or to synthesized isochronous speech from which word-related acoustic cues are removed
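As a sketch of what a word-rate AM cue could look like, the following applies a sinusoidal amplitude envelope at an assumed word rate to a placeholder waveform. The sampling rate, word rate, modulation depth, and the white-noise stand-in for speech are all illustrative assumptions, not the stimuli used in the study.

```python
import numpy as np

fs = 16000           # audio sampling rate in Hz (assumed)
word_rate = 2.0      # hypothetical word rate: one disyllabic word every 0.5 s
t = np.arange(0, 3, 1 / fs)

# Placeholder for an isochronous speech waveform with no acoustic cue to
# word boundaries (white noise stands in for real audio here).
speech = np.random.randn(t.size)

# Impose a word-rate amplitude modulation (AM) cue: intensity rises and
# falls once per word, acoustically marking the word rhythm.
depth = 0.5          # modulation depth (assumed)
am_envelope = 1.0 + depth * np.sin(2 * np.pi * word_rate * t)
speech_with_cue = speech * am_envelope
```

Comparing neural responses to the unmodulated and AM-cued versions of otherwise identical speech is one way to separate the contribution of the bottom-up acoustic cue from top-down linguistic knowledge.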
Summary
Low-frequency cortical activity in the delta (<4 Hz) band is synchronous to the rhythms of perceived linguistic units, such as words and phrases, on top of basic acoustic features, such as the speech envelope.