Abstract

Visual speech information can influence the perception of acoustic speech. In acoustically noisy environments, people can often understand more words correctly if they can see a talker as well as hear them. Additionally, conflicting visual speech information can alter the perception of acoustic speech (namely, the McGurk effect), producing a percept of a sound that was not present in the acoustic signal. This auditory and visual speech information does not need to be perfectly synchronous in order to be integrated; rather, there is a “synchrony window” over which the two streams can be combined. The extent to which a distracting cognitive load task affects the influence of visual speech is still not well understood, and it is also unknown whether a distracting cognitive task influences the integration of temporally asynchronous speech. A series of experiments using both speech-in-noise and McGurk tasks with concurrent working memory tasks was conducted to address these questions, with the temporal offset in some of the McGurk tasks also manipulated. Overall, the results suggest that while some interference from the cognitive task can be observed, this influence is quite small and does not substantially affect the integration of asynchronous speech.
