Abstract

Humans may make erroneous judgments when relying only on visual information obtained directly by the naked eye, especially when judging indistinct, distant targets in a complex environment. People therefore usually integrate information from two sensory channels, vision and audition, to perceive the environment and make judgments. We aimed to study the effect of auditory stimuli on audio-visual two-source integration and to verify the effectiveness of machine assistance. In this paper, a simple face-vs.-car picture discrimination experiment was designed and 20 healthy subjects were recruited. The effect of auditory stimuli on audio-visual integration was analyzed from three aspects: behavior, Event-Related Potentials (ERPs), and Brain Electrical Activity Mapping (BEAM). The behavioral results showed that adding auditory stimuli increased the accuracy of judgments and shortened reaction times. The ERP results showed that an auditory-related potential was evoked at 100 ms and a decision-related potential was evoked at 300–500 ms after the auditory stimulus was presented. Compared with visual stimuli alone, the audio-visual stimuli decreased the decision-related potential. The BEAM results suggested that the audio-visual stimuli activated the junction of the parietal and temporal lobes when the visual and auditory information was integrated.
