Abstract

Background: The brain's ability to integrate information from multiple sensory inputs has long intrigued researchers. Audio–visual (AV) interaction is a form of multisensory integration that we rely on to form meaningful representations of the environment around us, yet the literature on its underlying neural mechanisms remains limited.

Purpose: Quantitative EEG (QEEG), a tool with high temporal resolution, can be used to identify the cortical sources of AV interactions.

Methods: EEG data were recorded from 128 channels in 30 healthy subjects presented with auditory, visual, and AV stimuli in an object detection task. Electrical source imaging was performed using s-LORETA across seven frequency bands (lower alpha 1, lower alpha 2, upper alpha, beta, delta, gamma, theta) in AV versus unimodal conditions across 66 gyri.

Results: Cortical sources were activated in the theta, beta, and gamma bands in cross-modal versus unimodal conditions, which we propose reflects neural communication within the AV interaction network. These sources comprised areas involved in visual processing, auditory processing, established multisensory areas (frontotemporal cortex, parietal cortex, middle temporal gyrus, superior frontal gyrus, inferior frontal gyrus, precentral gyrus), and potential multisensory areas (paracentral, postcentral, and subcallosal gyri).

Conclusion: Together, these results offer an integrative view of the cortical areas engaged in frequency oscillations during AV interactions.
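To make the band-wise analysis concrete, the sketch below computes per-band power for a single EEG channel by zero-phase band-pass filtering. The abstract names the seven bands but not their cutoff frequencies, so the band edges, sampling rate, and the `band_power` helper here are illustrative assumptions, not the study's actual parameters or pipeline (which used s-LORETA source imaging, not sensor-level power).

```python
import numpy as np
from scipy.signal import butter, filtfilt

# Assumed band edges in Hz -- the abstract lists band names only,
# so these cutoffs are illustrative, not taken from the study.
BANDS = {
    "delta": (1, 4),
    "theta": (4, 8),
    "lower_alpha_1": (8, 10),
    "lower_alpha_2": (10, 12),
    "upper_alpha": (12, 14),
    "beta": (14, 30),
    "gamma": (30, 45),
}

def band_power(signal, fs, low, high, order=4):
    """Mean power of `signal` after zero-phase band-pass filtering."""
    nyq = fs / 2.0
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    filtered = filtfilt(b, a, signal)  # zero-phase: no temporal shift
    return float(np.mean(filtered ** 2))

# Synthetic single-channel example: a 6 Hz (theta-range) tone plus noise.
rng = np.random.default_rng(0)
fs = 250  # assumed sampling rate in Hz
t = np.arange(0, 2, 1 / fs)
x = np.sin(2 * np.pi * 6 * t) + 0.1 * rng.standard_normal(t.size)

powers = {name: band_power(x, fs, lo, hi) for name, (lo, hi) in BANDS.items()}
dominant = max(powers, key=powers.get)
print(dominant)  # the theta band should carry most of the power
```

In a real analysis, such sensor-level band powers would be compared across AV and unimodal conditions per channel before (or alongside) source localization.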
