Abstract

The ability to make accurate and timely decisions, such as judging when it is safe to cross the road, is the foundation of adaptive behavior. While the computational and neural processes supporting simple decisions on isolated stimuli have been well characterized, decision-making in the real world often requires integration of discrete sensory events over time and space. Because most previous experimental work on perceptual decision-making has focused on tasks involving only a single, task-relevant source of sensory input, it remains unclear how such integrative decisions are regulated computationally. Here we used psychophysics, electroencephalography, and computational modeling to understand how the human brain combines visual motion signals across space in the service of a single, integrated decision. To this end, we presented two random-dot kinematograms in the left and right visual hemifields. Coherent motion signals were shown briefly and concurrently in each location, and healthy adult human participants of both sexes reported the average of the two motion signals. We directly tested competing predictions arising from influential serial and parallel accounts of visual processing. Using a biologically plausible model of motion filtering, we found evidence in favor of parallel integration as the fundamental computational mechanism regulating integrated perceptual decisions.
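
The distinction between serial and parallel accounts can be made concrete with a toy evidence-accumulation simulation. The sketch below is illustrative only and is not the model used in the study: the drift rates, noise level, decision threshold, and the simple alternating-sampling rule for the serial variant are all hypothetical assumptions introduced here, not parameters from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_trial(drift_left, drift_right, mode="parallel",
                   threshold=1.0, noise_sd=0.1, dt=0.01, max_t=3.0):
    """Accumulate noisy motion evidence from two hemifields to a bound.

    mode="parallel": both streams feed one accumulator at every step.
    mode="serial":   the accumulator samples one stream at a time,
                     alternating between hemifields (one hypothetical
                     way to formalize a serial account).
    Returns (choice, reaction_time); choice is the sign of the
    accumulated evidence, i.e., the reported average motion direction.
    """
    x, t, step = 0.0, 0.0, 0
    while abs(x) < threshold and t < max_t:
        if mode == "parallel":
            drift = 0.5 * (drift_left + drift_right)  # combine both signals
        else:
            drift = drift_left if step % 2 == 0 else drift_right
        # Standard diffusion update: drift plus Gaussian noise scaled by sqrt(dt)
        x += drift * dt + noise_sd * np.sqrt(dt) * rng.standard_normal()
        t += dt
        step += 1
    return np.sign(x), t

# Example: weak leftward motion in the left hemifield, strong rightward
# motion in the right; the correct "average" report is rightward (+1).
choices, rts = zip(*(simulate_trial(-0.2, 0.8, mode="parallel")
                     for _ in range(1000)))
print(f"parallel: accuracy={np.mean(np.array(choices) == 1):.2f}, "
      f"mean RT={np.mean(rts):.2f}s")
```

Under this toy setup, the two modes predict different accuracy and reaction-time patterns as a function of how evidence strength is split across hemifields, which is the kind of divergence the competing-predictions test in the abstract relies on.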
