Abstract

It is clear that humans can extract statistical information from streams of visual input, yet how the brain processes sequential images into an abstract representation of the mean feature value remains poorly explored. Using multivariate pattern analyses of electroencephalography recorded while human observers viewed 10 sequentially presented Gabors of different orientations and estimated their mean orientation at the end, we investigated the sequential averaging mechanism by tracking the quality of individual and mean orientation representations as a function of sequential position. Critically, we varied the sequential variance of Gabor orientations to understand the neural basis of the perceptual mean errors that occur during sequential averaging. We found that the mean-orientation representation emerged at specific delays from each sequential stimulus onset and became increasingly accurate as additional Gabors were viewed. Especially in frontocentral electrodes, the neural representation of mean orientation improved more rapidly and to a greater degree in less volatile environments, whereas individual orientation information was encoded precisely regardless of environmental volatility. Computational analysis of the behavioral data further showed that perceptual mean errors arise from the cumulative construction of the mean orientation rather than from the low-level encoding of individual stimulus orientations. Thus, our findings reveal neural mechanisms that differentially accumulate increasingly abstract features from concrete pieces of information across the cortical hierarchy, depending on environmental volatility.

SIGNIFICANCE STATEMENT The visual system extracts behaviorally relevant summary statistical representations by exploiting the statistical regularity of the visual stream over time. However, how the neural representation of an abstract mean feature value develops in a temporally changing environment remains poorly understood. Here, we directly recover the mean orientation information of sequentially delivered Gabor stimuli with different orientations as a function of their positions in time. The mean-orientation representation, which is regularly updated, becomes increasingly accurate with increasing sequential position, especially in the frontocentral region. Further, perceptual mean errors arise from the cumulative process rather than from low-level stimulus encoding. Overall, our study reveals a role for higher cortical areas in integrating stimulus-specific information into increasingly abstract, task-oriented information.
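To make the decoding approach concrete, the minimal sketch below illustrates time-resolved multivariate decoding of stimulus orientation from multichannel EEG epochs, the general kind of analysis the abstract describes. It is not the authors' pipeline: the data are synthetic, and the array shapes, number of orientation bins, classifier, and cross-validation scheme are assumptions chosen only for illustration.

```python
# Illustrative sketch only (synthetic data, assumed shapes and classifier);
# not the authors' analysis pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_channels, n_times, n_bins = 200, 64, 100, 6

# Synthetic epochs (trials x channels x time points) and orientation-bin labels.
y = rng.integers(0, n_bins, size=n_trials)
X = rng.normal(size=(n_trials, n_channels, n_times))
# Inject a weak, label-dependent signal into a few channels at late time points,
# standing in for orientation information emerging after stimulus onset.
X[:, :8, 60:] += 0.3 * y[:, None, None]

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# Decode the orientation bin separately at each time point with 5-fold
# cross-validation; the accuracy time course indexes when and how well
# orientation information can be read out from the scalp signal.
accuracy = np.array([
    cross_val_score(clf, X[:, :, t], y, cv=5).mean()
    for t in range(n_times)
])
print(f"chance = {1 / n_bins:.2f}, peak decoding accuracy = {accuracy.max():.2f}")
```

In the study itself, analogous decoders would presumably be trained on recorded EEG time-locked to each of the 10 Gabors, targeting both the individual orientation and the running mean orientation at each sequential position; the sketch conveys only the general logic of time-resolved decoding.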
