Abstract

Seeing the articulatory gestures of the speaker (“speech reading”) enhances speech perception, especially in noisy conditions. Recent neuroimaging studies tentatively suggest that speech reading activates the speech motor system, which in turn influences auditory areas of the superior-posterior temporal lobe via an efference copy. Here, nineteen healthy volunteers were presented with silent video clips of a person articulating the Finnish vowels /a/, /i/ (non-targets), and /o/ (targets) during event-related functional magnetic resonance imaging (fMRI). Speech reading significantly activated the visual cortex, posterior fusiform gyrus (pFG), posterior superior temporal gyrus and sulcus (pSTG/S), and the speech motor areas, including the premotor cortex, parts of the inferior (IFG) and middle (MFG) frontal gyri extending into frontal polar (FP) structures, somatosensory areas, and the supramarginal gyrus (SMG). Structural equation modelling (SEM) of these data suggested that information flows first from the extrastriate visual cortex to the pFG and, from there, in parallel to the pSTG/S and MFG/FP. From the pSTG/S, information flow continues to the IFG or SMG and eventually to somatosensory areas. Feedback connectivity was estimated to run from the MFG/FP to the IFG and pSTG/S. The direct functional connection from the pFG to the MFG/FP and the feedback connections from the MFG/FP to the pSTG/S and IFG support the hypothesis that prefrontal speech motor areas influence auditory speech processing in the pSTG/S via an efference copy.
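
Although the study estimated the model with dedicated SEM software, the reported path structure can be illustrated with a short sketch. The following is a minimal example, assuming the region-of-interest (ROI) time series have been extracted into one column per region of a table; the Python package semopy, the variable names, the input file, and the SMG-to-somatosensory endpoint are illustrative assumptions rather than the authors' actual implementation or model constraints.

    # Illustrative sketch only: encode the connectivity paths described in the
    # abstract in lavaan-style regression syntax and estimate the path
    # coefficients with semopy.  The SS ~ SMG path is one plausible reading of
    # "eventually somatosensory areas"; variable names are placeholders that
    # must match the ROI column names in the input table.
    import pandas as pd
    import semopy

    PATHS = """
    pFG ~ VIS
    MFG_FP ~ pFG
    pSTGS ~ pFG + MFG_FP
    IFG ~ pSTGS + MFG_FP
    SMG ~ pSTGS
    SS ~ SMG
    """

    roi_data = pd.read_csv("roi_timeseries.csv")  # hypothetical file: one column per region
    model = semopy.Model(PATHS)
    model.fit(roi_data)                           # maximum-likelihood fit of the path model
    print(model.inspect())                        # estimated path coefficients

In such a formulation, the hypothesized feedback from MFG/FP to pSTG/S and IFG appears simply as additional regression terms on those regions' time series, alongside the feedforward paths from pFG and pSTG/S.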

Highlights

  • Speech perception is not limited to hearing, as seeing a speaker's articulatory gestures (the lip forms and the positions of the jaw and tongue) significantly enhances speech perception, especially in noisy conditions [1,2]

  • Demonstrating that visual information has access to the auditory system at relatively early stages of sound processing, certain combinations of incongruent phonetic sounds and articulatory gestures can produce illusory third-category phonetic percepts; for instance, visual /ga/ paired with auditory /ba/ often results in the perception of /da/ [3], especially when the auditory stimulus is degraded or presented in noise [1]

  • In the target condition /o/, significant parietal cortical activity was observed in the secondary somatosensory cortex (BA 2) and in the supramarginal gyrus (SMG) (BA 40)

Introduction

Speech perception is not limited to hearing, as seeing a speaker's articulatory gestures (the lip forms and the positions of the jaw and tongue) significantly enhances speech perception, especially in noisy conditions [1,2]. A number of previous functional magnetic resonance imaging (fMRI) studies have mapped the brain areas that participate in the processing of visual speech (i.e., “speech reading”) and the brain areas in which speech reading influences auditory speech processing. These studies suggest that auditory processing is robustly modulated especially in the posterior superior temporal gyrus/sulcus (pSTG/S) of the left hemisphere [4,5,6,7,8,9,10,11,12,13,14,15,16,17]. The superior/posterior aspects of the temporal lobe, which appear to be the site of audiovisual interactions, have been hypothesized to contain representations mapping “doable” articulations onto their associated sounds [31], in line with the motor theory of speech perception [32,33].
