Abstract

Behavioural, neuroimaging and lesion studies show that face processing plays a special role in human perception. The purpose of this EEG study was to explore whether auditory information influences visual face perception. We employed a 2×2 factorial design and presented subjects with visual stimuli, either cartoon faces or scrambled faces, in which the size change of one component (the mouth, in the face condition) was either congruent or incongruent with the amplitude modulation of a simultaneously presented auditory signal. Our data show a significant main effect of signal congruence at an ERP peak around 135 ms and a significant main effect of face configuration at around 200 ms. The timing and scalp topography of both effects correspond well to previously reported data on the integration of non-redundant audio-visual stimuli and on face-selective processing. Our analysis did not show any significant statistical interactions. This double dissociation suggests that the early component, at 135 ms, is sensitive to audio-visual congruency but not to facial configuration, and that the later component is sensitive to facial configuration but not to AV congruency. We conclude that facial configurational processing is not influenced by the congruence of simultaneous auditory signals and is independent of featural processing, for which we see evidence of multisensory integration.
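To make the analysis logic concrete, the sketch below shows how a 2×2 repeated-measures ANOVA on per-subject ERP peak amplitudes could test the two main effects (congruence, face configuration) and their interaction. This is a minimal illustration only: the column names, subject count, and simulated effect sizes are assumptions for demonstration, not the authors' data or analysis pipeline.

```python
# Illustrative sketch: 2x2 repeated-measures ANOVA on hypothetical
# per-subject ERP peak amplitudes (e.g., the ~135 ms component).
# All values are simulated; this is not the paper's dataset.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(0)
n_subjects = 20
rows = []
for subj in range(n_subjects):
    for congruence in ("congruent", "incongruent"):
        for config in ("face", "scrambled"):
            # Simulate a main effect of congruence with no interaction
            # with face configuration, mirroring the reported pattern
            # for the early component.
            amp = (rng.normal(2.0, 0.5)
                   + (0.8 if congruence == "incongruent" else 0.0))
            rows.append({"subject": subj,
                         "congruence": congruence,
                         "configuration": config,
                         "amplitude": amp})
df = pd.DataFrame(rows)

# Within-subject ANOVA: main effects of congruence and configuration,
# plus their interaction. The double dissociation described above would
# appear as one significant main effect per component (congruence at
# ~135 ms, configuration at ~200 ms) with no significant interaction.
res = AnovaRM(df, depvar="amplitude", subject="subject",
              within=["congruence", "configuration"]).fit()
print(res)
```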
