Abstract
How the brain decomposes and integrates information in multimodal speech perception is linked to oscillatory dynamics. However, how speech processing takes advantage of redundancy between different sensory modalities, and how this redundancy translates into specific oscillatory patterns, remain unclear. We address the role of lower beta activity (~20 Hz), generally associated with motor functions, as an amodal central coordinator that receives bottom-up delta-theta copies from specific sensory areas and generates top-down temporal predictions for auditory entrainment. Dissociating temporal prediction from entrainment may explain how and why visual input benefits speech processing rather than adding cognitive load in multimodal speech perception. On the one hand, body movements convey prosodic and syllabic features at delta and theta rates (i.e., 1–3 Hz and 4–7 Hz, respectively). On the other hand, the natural precedence of visual input before auditory onsets may prepare the brain to anticipate and facilitate the integration of auditory delta-theta copies of the prosodic-syllabic structure. Here, we identify three fundamental criteria, based on recent evidence and hypotheses, that support the notion that lower motor beta activity may play a central and generic role in temporal prediction during speech perception. First, beta activity must respond to rhythmic stimulation across modalities. Second, beta power must respond to biological motion and speech-related movements conveying temporal information in multimodal speech processing. Third, temporal prediction may recruit a communication loop between motor and primary auditory cortices (PACs) via delta-to-beta cross-frequency coupling. We discuss evidence related to each criterion and extend these concepts to a beta-motivated framework of multimodal speech processing.
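To make the signal-level claim concrete, the sketch below shows, under purely illustrative assumptions (a synthetic stand-in for a speech waveform, an arbitrary sampling rate, and the delta/theta band edges given above), how a speech amplitude envelope can be decomposed into the delta and theta components said to carry prosodic and syllabic structure. This is a toy illustration, not an analysis pipeline from the studies discussed.

```python
# Minimal sketch (illustrative assumptions, not a pipeline from the cited work):
# extract the amplitude envelope of a "speech" signal and band-pass it into
# delta (1-3 Hz) and theta (4-7 Hz) components, the rates associated here with
# prosodic and syllabic structure.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    """Zero-phase Butterworth band-pass filter (SOS form for numerical stability)."""
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

fs = 1000                        # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
# Toy stand-in for speech: a 200 Hz carrier amplitude-modulated at ~2 Hz
# (prosodic/delta rate) and ~5 Hz (syllabic/theta rate).
speech = (1 + 0.5 * np.sin(2 * np.pi * 2 * t)
            + 0.5 * np.sin(2 * np.pi * 5 * t)) * np.sin(2 * np.pi * 200 * t)

envelope = np.abs(hilbert(speech))            # broadband amplitude envelope
delta_env = bandpass(envelope, 1.0, 3.0, fs)  # prosodic-rate "copy"
theta_env = bandpass(envelope, 4.0, 7.0, fs)  # syllabic-rate "copy"
```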
Highlights
How the brain decomposes and integrates information in multimodal speech perception is linked to oscillatory dynamics.
How does the brain integrate speech structural information conveyed in two sensory modalities and processed separately in their corresponding cortical areas? In the present review, we discuss the role of lower beta oscillations (∼20 Hz), originating from motor cortex and generally associated with motor functions (Engel and Fries, 2010; Press et al., 2011; Di Nota et al., 2017), as a potential amodal coordinator that integrates structural information of the speech signal, extracted in specialized areas, via an entrainment mechanism (a sketch of one common entrainment measure follows these highlights).
Correct multimodal integration of the continuous speech structure encoded in low-frequency patterns is crucial, as it would allow the brain to benefit from redundant information across modalities and, in turn, generate stronger temporal predictions that ease the sensory sampling of incoming input.
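As a toy illustration of the entrainment measure invoked above, the sketch below computes a phase-locking value (PLV) between a delta-band "speech envelope" and a simulated neural signal. All signals, parameters, and band edges are assumptions chosen for illustration rather than values from the reviewed studies.

```python
# Minimal sketch of quantifying auditory entrainment to the speech envelope:
# the phase-locking value (PLV) between the delta-band envelope and a
# delta-band neural signal (toy data; parameters are illustrative assumptions).
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

def plv(x, y, lo, hi, fs):
    """Phase-locking value in [0, 1] between two signals within one band."""
    px = np.angle(hilbert(bandpass(x, lo, hi, fs)))
    py = np.angle(hilbert(bandpass(y, lo, hi, fs)))
    return np.abs(np.mean(np.exp(1j * (px - py))))

fs = 250
t = np.arange(0, 20, 1 / fs)
envelope = np.sin(2 * np.pi * 2 * t)  # toy delta-rate speech envelope
# Simulated neural signal: phase-shifted copy of the envelope rhythm plus noise.
eeg = np.sin(2 * np.pi * 2 * t + 0.4) + 0.5 * np.random.randn(t.size)

print(f"delta-band PLV: {plv(envelope, eeg, 1.0, 3.0, fs):.2f}")  # near 1 = strong entrainment
```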
Summary
Recent research has addressed the role of lower beta power in unimodal stimulus-driven temporal prediction without direct motor engagement. Fujioka et al. (2009) investigated lower beta modulations (∼15–20 Hz) with MEG during passive listening to regular auditory tone sequences with random omissions of a single tone. Building on such findings, we propose three criteria for a generic coordinating role of lower beta: (1) lower beta activity responds to rhythmic stimulation across sensory modalities; (2) beta power responds to biological motion and speech-related movements conveying temporal information; and (3) delta-to-beta cross-frequency coupling supports distant communication between (left) motor and auditory cortices when listening to speech. Meeting these three criteria, we hypothesize that lower beta plays the role of an amodal central organizer that coordinates afferent copies from visual and auditory areas in multimodal temporal prediction during speech perception. Auditory information transferred from primary to secondary auditory cortex (i.e., left posterior superior temporal gyrus, lpSTG) is qualitatively improved, and multimodal speech integration is facilitated. It remains unclear whether the visual delta-theta afferent copies are first integrated by broader beta motor activity through biological motion perception and then sent to the lower beta generic coordinator, or whether they feed it directly. In the latter case, adding visual prosodic structure would not significantly improve speech perception, as the deficit arises later, when auditory and visual prosodic information is integrated by the beta-based coordinator.
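The delta-to-beta cross-frequency coupling named in criterion (3) can be illustrated with a standard phase-amplitude coupling measure. The sketch below simulates a signal whose lower-beta (~15–25 Hz) amplitude waxes and wanes with delta (1–3 Hz) phase and quantifies the coupling with the mean vector length; the simulated signal, band edges, and parameters are assumptions for illustration, not results or methods from the cited studies.

```python
# Toy sketch of delta-to-beta phase-amplitude coupling, quantified with the
# mean vector length (MVL). Simulated data; all parameters are assumptions.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def bandpass(x, lo, hi, fs, order=4):
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

fs = 500
t = np.arange(0, 30, 1 / fs)
delta = np.sin(2 * np.pi * 2 * t)  # 2 Hz delta rhythm
# Lower beta (~20 Hz) whose amplitude is modulated by delta phase, mimicking
# delta-to-beta coupling between motor and auditory cortices.
beta = (1 + 0.8 * delta) * np.sin(2 * np.pi * 20 * t)
signal = delta + beta + 0.5 * np.random.randn(t.size)

phase = np.angle(hilbert(bandpass(signal, 1.0, 3.0, fs)))  # delta phase
amp = np.abs(hilbert(bandpass(signal, 15.0, 25.0, fs)))    # lower-beta amplitude
mvl = np.abs(np.mean(amp * np.exp(1j * phase)))            # coupling strength
print(f"delta-to-beta coupling (MVL): {mvl:.3f}")
```

A large MVL relative to surrogate (phase-shuffled) data would indicate that lower-beta amplitude is systematically tied to delta phase, the signature this criterion refers to.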