Abstract

An important step in perceptual processing is the integration of information from different sensory modalities into a coherent percept. It has been suggested that such crossmodal binding might be achieved by transient synchronization of neurons from different modalities in the gamma-frequency range (>30 Hz). Here we employed a crossmodal priming paradigm, modulating the semantic congruency between visual–auditory natural object stimulus pairs, during the recording of the high-density electroencephalogram (EEG). Subjects performed a semantic categorization task. Analysis of the behavioral data showed a crossmodal priming effect (facilitated auditory object recognition) in response to semantically congruent stimuli. Differences in event-related potentials (ERPs) were found between 250 and 350 ms, which were localized to the left middle temporal gyrus (BA 21) using a distributed linear source model. Early gamma-band activity (40–50 Hz) was increased between 120 ms and 180 ms following auditory stimulus onset for semantically congruent stimulus pairs. Source reconstruction for this gamma-band response revealed a maximal increase in the left middle temporal gyrus (BA 21), an area known to be involved both in the processing of complex auditory stimuli and in multisensory processing. The data support the hypothesis that oscillatory activity in the gamma-band reflects crossmodal semantic-matching processes in multisensory convergence sites.
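For readers who want to reproduce this kind of analysis, the sketch below shows one common way to extract band-limited gamma power (40–50 Hz) in the reported 120–180 ms post-stimulus window from epoched EEG, using Morlet wavelet decomposition with MNE-Python. This is not the authors' actual pipeline; the sampling rate, epoch layout, condition labels, and wavelet parameters are placeholder assumptions for illustration.

```python
# Minimal sketch (assumed parameters, simulated data) of gamma-band power
# extraction in a congruent-vs-incongruent contrast; not the study's pipeline.
import numpy as np
from mne.time_frequency import tfr_array_morlet

sfreq = 500.0                              # Hz, assumed sampling rate
times = np.arange(-0.2, 0.6, 1.0 / sfreq)  # epoch: -200 to 600 ms
n_epochs, n_channels = 40, 64              # placeholder dimensions
rng = np.random.default_rng(0)
data = rng.standard_normal((n_epochs, n_channels, times.size)) * 1e-6

# Morlet wavelet power over the gamma band of interest (40-50 Hz)
freqs = np.arange(40.0, 51.0, 2.0)
power = tfr_array_morlet(data, sfreq=sfreq, freqs=freqs,
                         n_cycles=7.0, output='power')
# power shape: (n_epochs, n_channels, n_freqs, n_times)

# Average power within the reported 120-180 ms window and across frequencies
tmask = (times >= 0.120) & (times <= 0.180)
gamma = power[:, :, :, tmask].mean(axis=(2, 3))   # (n_epochs, n_channels)

# Contrast semantically congruent vs. incongruent trials (placeholder labels)
congruent = np.zeros(n_epochs, dtype=bool)
congruent[: n_epochs // 2] = True
effect = gamma[congruent].mean(axis=0) - gamma[~congruent].mean(axis=0)
print(effect.shape)  # per-channel congruency effect on gamma power
```

In a real analysis, single-trial power would typically be baseline-corrected and the congruency contrast tested statistically (e.g., with a cluster-based permutation test) before any source reconstruction step.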

