Abstract

Neural oscillations subserve a broad range of speech processing and language comprehension functions. Using electroencephalography (EEG), we investigated frequency-specific directed interactions across whole-brain regions while participants processed Chinese sentences presented in different modalities (i.e., auditory, visual, and audio-visual). The results indicate that low-frequency responses correspond to the aggregation of information flow in the primary sensory cortices of the respective modalities. Information flow dominated by high-frequency responses exhibited characteristics of bottom-up flow from left posterior temporal to left frontal regions. The network pattern of top-down information flowing out of the left frontal lobe was jointly dominated by low- and high-frequency rhythms. Overall, our results suggest that the brain may be modality-independent when processing higher-order language information.
