The ability to make accurate and timely decisions, such as judging when it is safe to cross the road, is the foundation of adaptive behavior. While the computational and neural processes supporting simple decisions about isolated stimuli have been well characterized, decision-making in the real world often requires integration of discrete sensory events over time and space. Most previous experimental work on perceptual decision-making has focused on tasks that involve only a single, task-relevant source of sensory input. It therefore remains unclear how such integrative decisions are regulated computationally. Here we used psychophysics, electroencephalography, and computational modeling to understand how the human brain combines visual motion signals across space in the service of a single, integrated decision. To this end, we presented two random-dot kinematograms in the left and right visual hemifields. Coherent motion signals were shown briefly and concurrently in each location, and healthy adult human participants of both sexes reported the average of the two motion signals. We directly tested competing predictions arising from influential serial and parallel accounts of visual processing. Using a biologically plausible model of motion filtering, we found evidence in favor of parallel integration as the fundamental computational mechanism regulating integrated perceptual decisions.