Abstract

In natural listening situations, understanding spoken sentences requires interactions across several levels of information, from multisensory to linguistic. In two electroencephalographic studies, we examined the neural oscillations associated with linguistic prediction during unimodal and bimodal sentence listening, to determine how these brain correlates were affected by the sensory streams delivering linguistic information. Sentence contexts that were strongly predictive of a particular word ended with a possessive adjective that either matched or mismatched the gender of the predicted word. Alpha, beta and gamma oscillations were investigated, as they are considered to play a crucial role in the predictive process. During audiovisual or auditory-only listening to sentences, no evidence of word prediction was observed. In contrast, in a more challenging listening situation, in which bimodal audiovisual streams switched to a unimodal auditory stream, gamma power was sensitive to word prediction based on prior sentence context. These results suggest that prediction spreading from higher (sentence) levels to lower (word) levels is optional during unimodal and bimodal sentence listening and emerges when the listening situation is more challenging. Alpha and beta oscillations decreased when semantically constraining sentences were delivered in the audiovisual modality compared with the auditory-only modality. Altogether, our findings have major implications for our understanding of the neural mechanisms that support predictive processing in multimodal language comprehension.
