Abstract

Speech production gives rise to distinct auditory and somatosensory feedback signals, which are dynamically integrated to enable online monitoring and error correction, though it remains unclear how the sensorimotor system supports the integration of these multimodal signals. Capitalizing on the parity of sensorimotor processes supporting perception and production, the current study employed the McGurk paradigm to induce multimodal sensory congruence/incongruence. EEG data from a cohort of 39 typical speakers were decomposed with independent component analysis to identify bilateral mu rhythms, indices of sensorimotor activity. Subsequent time-frequency analyses revealed bilateral patterns of event-related desynchronization (ERD) across alpha and beta frequency ranges over the time course of perceptual events. Right mu activity was characterized by reduced ERD during all cases of audiovisual incongruence, while left mu activity was attenuated and protracted in McGurk trials eliciting sensory fusion. Results were interpreted to suggest distinct hemispheric contributions, with right hemisphere mu activity supporting a coarse incongruence detection process and left hemisphere mu activity reflecting a more granular level of analysis, including phonological identification and incongruence resolution. Findings are also considered with regard to incongruence detection and resolution processes during production.

Highlights

  • During speech production, sensory feedback is integrated into feedforward motor commands to enable online error detection and fluent coarticulation at normal speech rates [1].

  • These notions have been explicitly outlined in computational models of speech production such as Directions Into Velocities of Articulators (DIVA; [2, 3]) and State Feedback Control (SFC; [4]), with their assertions well supported by the results of auditory perturbation studies demonstrating online adaptations to vocal output as a function of unexpected perturbations to auditory reafference [5, 6].

  • A similar functional dissociation has been observed in the visual domain, with the right hemisphere detecting anomalies/incongruence and the left hemisphere engaging in further processing to resolve detected incongruities [126, 127, 128], and the results of the current study suggest that a similar hemispheric dissociation is present in the sensorimotor system.

Introduction

Sensory feedback is integrated into feedforward motor commands to enable online error detection and fluent coarticulation at normal speech rates [1]. A recent study by Smith et al. [9] probed the influence of convergent and divergent feedback signals on speech motor control, demonstrating dynamic patterns of adaptive behavior in response to unimodal and multimodal sensory feedback perturbations. It remains unclear how these multimodal feedback signals are integrated in the brain and how these integration processes influence sensorimotor activity supporting speech processing [10, 11]. The goal of the current study is to clarify how the integration of convergent and divergent sensory streams influences sensorimotor activity.
