Abstract

Speech perception often involves multisensory processing. Although previous studies have demonstrated visual [1, 2] and somatosensory [3, 4] interactions with auditory processing, it is unclear whether somatosensory information can contribute to audiovisual speech perception. This study explored the neural consequences of somatosensory interactions in audiovisual speech processing. We assessed whether somatosensory orofacial stimulation influenced event-related potentials (ERPs) in response to an audiovisual speech illusion (the McGurk effect [1]). ERPs were recorded at 64 scalp sites in response to audiovisual speech stimulation and somatosensory stimulation. In the audiovisual condition, an auditory stimulus /ba/ was synchronized with video of congruent facial motion (the production of /ba/) or incongruent facial motion (the production of /da/: the McGurk condition). These two audiovisual stimuli were presented in random order, with and without somatosensory stimulation associated with facial skin deformation. We found ERP differences associated with the McGurk effect in the presence of somatosensory stimulation. ERPs for the McGurk effect diverged reliably around 280 ms after auditory onset. The results demonstrate that somatosensory inputs alter the cortical potentials of audiovisual processing and suggest that somatosensory information encoding facial motion also influences speech processing.
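To make the condition comparison concrete, the following is a minimal sketch of how ERPs for the two audiovisual conditions could be averaged and their divergence latency located. It is not the authors' analysis pipeline: the sampling rate, epoch window, amplitude threshold, and the synthetic data are all illustrative assumptions.

```python
# Illustrative sketch of an ERP condition comparison (not the paper's pipeline).
import numpy as np

SFREQ = 500.0   # assumed sampling rate (Hz)
T0 = -0.1       # assumed epoch start relative to auditory onset (s)

def erp(epochs: np.ndarray) -> np.ndarray:
    """Average single trials (trials x channels x samples) into an ERP."""
    return epochs.mean(axis=0)

def divergence_onset(erp_a: np.ndarray, erp_b: np.ndarray,
                     threshold_uv: float = 1.0):
    """Return the first latency (s, relative to auditory onset) at which
    the two ERPs differ by more than threshold_uv on any channel."""
    diff = np.abs(erp_a - erp_b).max(axis=0)   # max difference over channels
    idx = np.flatnonzero(diff > threshold_uv)
    return T0 + idx[0] / SFREQ if idx.size else None

# Hypothetical data: 100 trials, 64 channels, 0.7 s epochs per condition.
rng = np.random.default_rng(0)
mcgurk_epochs = rng.normal(size=(100, 64, 350))
congruent_epochs = rng.normal(size=(100, 64, 350))

onset = divergence_onset(erp(mcgurk_epochs), erp(congruent_epochs))
print(f"ERPs diverge at {onset:.3f} s" if onset is not None
      else "no divergence found")
```

In practice a cluster-based permutation test across channels and time points would be more appropriate than a fixed amplitude threshold, but the structure of the comparison (per-condition averaging, then a search for the earliest reliable difference after auditory onset) is the same.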
