Abstract
Seeing a talker’s face can aid audiovisual (AV) integration when speech is presented in noise. However, few studies have simultaneously manipulated auditory and visual degradation. We aimed to establish how degrading the auditory and visual signals affected AV integration. Where people look on the face in this context is also of interest: Buchan, Paré and Munhall (Brain Research, 1242, 162–171, 2008) found that fixations on the mouth increased in the presence of auditory noise, whilst Wilson, Alsius, Paré and Munhall (Journal of Speech, Language, and Hearing Research, 59(4), 601–615, 2016) found that mouth fixations decreased with decreasing visual resolution. In Condition 1, participants listened to clear speech; in Condition 2, they listened to vocoded speech designed to simulate the information provided by a cochlear implant. Speech was presented at three levels of auditory noise and three levels of visual blurring. Adding noise to the auditory signal increased McGurk responses, while blurring the visual signal decreased them. Participants fixated the mouth more on trials in which the McGurk effect was perceived. Auditory noise led people to fixate the mouth more, while visual degradation led them to fixate it less. Combined, the results suggest that modality preference, and where people look during AV integration of incongruent syllables, vary according to the quality of the information available.
Highlights
Six participants were excluded after data collection and before analyses were conducted: four due to incomplete eye-movement data, one because of a diagnosis of attention-deficit hyperactivity disorder (ADHD), and one because English was not their first language.
We investigated how perception of the McGurk effect and the accompanying eye movements were affected when speech was presented in auditory noise and visual blur.
Summary
In our everyday environment we are bombarded with information from our senses; multisensory integration is essential for helping to consolidate information and make sense of the world. Multisensory information is often complementary; for example, to understand the person speaking during a conversation, the auditory element (the voice of the speaker) and the visual element (the face of the speaker) are combined into a single percept. Sensory pathways in the brain are cross-modal, meaning they can be influenced by other modalities (Shimojo & Shams, 2001). This idea is underpinned in part by evidence from audiovisual perceptual illusions that arise when synchronized, incongruent information is presented in the auditory and visual modalities. Research has shown that auditory stimuli can influence visual perception, as demonstrated in the sound-induced flash illusion, in which viewers perceive a unitary flash as a double flash if it coincides with two auditory beeps (Shams, Kamitani & Shimojo, 2000). Conversely, two flashes can be perceived as a single flash if a single beep is presented; this is termed the fusion effect (Andersen, Tiippana & Sams, 2004).