Abstract
Human perception of ambiguous sensory signals is biased by prior experiences. It is not known how such prior information is encoded, retrieved and combined with sensory information by neurons. Previous authors have suggested dynamic encoding mechanisms for prior information, whereby top-down modulation of firing patterns on a trial-by-trial basis creates short-term representations of priors. Although such a mechanism may well account for perceptual bias arising in the short term, it does not account for the often irreversible and robust changes in perception that result from long-term, developmental experience. Based on the finding that more frequently experienced stimuli gain greater representations in sensory cortices during development, we reasoned that prior information could be stored in the size of cortical sensory representations. For the case of auditory perception, we use a computational model to show that prior information about sound frequency distributions may be stored in the size of primary auditory cortex frequency representations, read out by elevated baseline activity in all neurons and combined with sensory-evoked activity to generate a percept that conforms to Bayesian integration theory. Our results suggest an alternative neural mechanism for experience-induced long-term perceptual bias in the context of auditory perception. They make the testable prediction that the extent of such perceptual prior bias is modulated by both the degree of cortical reorganization and the magnitude of spontaneous activity in primary auditory cortex. Given that cortical over-representation of frequently experienced stimuli, as well as perceptual bias towards such stimuli, is a common phenomenon across sensory modalities, our model may generalize across sensory modalities rather than being specific to auditory perception.
Highlights
Natural stimuli are variable and often mixed with noise
The results indicate that prior information stored in primary auditory cortex frequency representations can be read out by locally generated neuronal activity and combined with sensory-evoked activity to generate a percept that conforms to Bayesian integration theory
The maximum-likelihood estimate of sensory input from population responses is insensitive to inhomogeneity of sensory representations, and always converges on the input stimulus
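The highlight above can be illustrated with a minimal decoding sketch. The sketch below is not taken from the paper's model; all parameters (tuning widths, neuron counts, the over-represented 4 kHz band) are illustrative assumptions. It builds a population of Gaussian frequency-tuned neurons whose tuning-curve centers are packed more densely near one frequency, then decodes the noiseless population response with a Poisson maximum-likelihood estimator. The estimate lands on the input frequency even though the representation is inhomogeneous.

```python
import numpy as np

rng = np.random.default_rng(0)

# Inhomogeneous representation: tuning-curve centers are densely packed
# near 4 kHz (a hypothetical over-represented frequency) and sparse
# elsewhere. All parameters here are illustrative assumptions.
centers = np.sort(np.concatenate([
    rng.normal(4.0, 0.3, 60),   # over-represented band
    rng.uniform(1.0, 8.0, 20),  # sparse background
]))
sigma = 0.5                      # tuning width (kHz)
gain = 20.0                      # peak firing rate (spikes/s)

def tuning(s):
    """Mean population response to frequency s (Gaussian tuning curves)."""
    return gain * np.exp(-(s - centers) ** 2 / (2 * sigma ** 2))

def ml_decode(r, grid=np.linspace(1.0, 8.0, 2001)):
    """Maximum-likelihood frequency estimate under Poisson noise:
    argmax_s sum_i [ r_i * log f_i(s) - f_i(s) ]."""
    f = gain * np.exp(-(grid[:, None] - centers[None, :]) ** 2
                      / (2 * sigma ** 2))
    loglik = (r * np.log(f + 1e-12) - f).sum(axis=1)
    return grid[np.argmax(loglik)]

stimulus = 6.0                           # a frequency in the sparse region
estimate = ml_decode(tuning(stimulus))   # decode the mean response
# The ML estimate recovers the stimulus despite the uneven representation,
# because each neuron's log-likelihood term peaks where its observed rate
# matches its predicted rate, i.e. at the true frequency.
```

Decoding a stimulus inside the dense 4 kHz band gives the same result: the density of tuning curves changes the precision of the estimate, not its location, which is why a prior cannot enter perception through ML decoding alone.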
Summary
Natural stimuli are variable and often mixed with noise. Our perception of these stimuli is derived from ambiguous sensory inputs. Psychophysical experiments in humans and primates indicate that this ambiguity is partly compensated for by incorporating information about the probabilities of previously experienced stimuli directly into the percept in a Bayesian manner [1,2,3]. It is not known how this prior information is encoded, retrieved and combined with sensory information by neurons [4,5]. Previous theoretical investigations of Bayesian inference were often based on homogeneous stimulus representations, i.e., all possible values of stimulus parameters are evenly represented [5]. In such a representational system, prior information is typically modeled as the activation of a sub-population of neurons by top-down influences or cross-modal interactions [5,6]. These prior storage and integration processes are believed to occur in higher-level/multisensory cortical areas, but not in low-level sensory cortices.
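The Bayesian integration referred to above has a simple closed form when both the prior over stimulus frequency and the sensory likelihood are modeled as Gaussians: the percept (posterior mean) is a precision-weighted average of the prior mean and the observation. The sketch below uses made-up parameter values, not values from the paper, to show how the percept is pulled toward a frequently experienced frequency.

```python
# A minimal sketch of Bayesian integration for sound frequency.
# All numbers are illustrative assumptions, not values from the paper.
prior_mean, prior_sd = 4.0, 0.5   # long-term stimulus statistics (kHz)
obs, obs_sd = 5.0, 1.0            # noisy sensory measurement (kHz)

# For Gaussian prior and likelihood, the posterior is Gaussian with a
# precision-weighted mean:
#   mu_post = (mu_p / sd_p^2 + x / sd_x^2) / (1 / sd_p^2 + 1 / sd_x^2)
w_prior = 1.0 / prior_sd ** 2
w_obs = 1.0 / obs_sd ** 2
posterior_mean = (w_prior * prior_mean + w_obs * obs) / (w_prior + w_obs)
posterior_sd = (w_prior + w_obs) ** -0.5

# The percept (4.2 kHz) lies between the observation (5.0 kHz) and the
# prior mean (4.0 kHz), biased toward the frequently experienced
# frequency because the prior is narrower than the likelihood.
```

The tighter the prior (smaller `prior_sd`) or the noisier the measurement (larger `obs_sd`), the stronger the pull toward the prior mean; with a flat prior the percept reduces to the raw observation.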