Abstract
Early visual cortex receives non-feedforward input from lateral and top-down connections (Muckli & Petro 2013 Curr. Opin. Neurobiol. 23, 195–201. (doi:10.1016/j.conb.2013.01.020)), including long-range projections from auditory areas. Early visual cortex can code for high-level auditory information, with neural patterns representing natural sound stimulation (Vetter et al. 2014 Curr. Biol. 24, 1256–1262. (doi:10.1016/j.cub.2014.04.020)). We discuss a number of questions arising from these findings. What is the adaptive function of bimodal representations in visual cortex? What type of information projects from auditory to visual cortex? What are the anatomical constraints of auditory information in V1, for example, periphery versus fovea, superficial versus deep cortical layers? Is there a putative neural mechanism we can infer from human neuroimaging data and recent theoretical accounts of cortex? We also present data showing we can read out high-level auditory information from the activation patterns of early visual cortex even when visual cortex receives simple visual stimulation, suggesting independent channels for visual and auditory signals in V1. We speculate which cellular mechanisms allow V1 to be contextually modulated by auditory input to facilitate perception, cognition and behaviour. Beyond cortical feedback that facilitates perception, we argue that there is also feedback serving counterfactual processing during imagery, dreaming and mind wandering, which is not relevant for immediate perception but for behaviour and cognition over a longer time frame. This article is part of the themed issue ‘Auditory and visual scene analysis’.
Highlights
Sensory cortices receive domain-specific information through their primary afferent pathways, and information from the other senses via cortical feedback and top-down pathways [1].
What is the adaptive function of bimodal representations in visual cortex? What type of information projects from auditory to visual cortex? What are the anatomical constraints of auditory information in V1, for example, periphery versus fovea, superficial versus deep cortical layers? Is there a putative neural mechanism we can infer from human neuroimaging data and recent theoretical accounts of cortex? We present data showing we can read out high-level auditory information from the activation patterns of early visual cortex even when visual cortex receives simple visual stimulation, suggesting independent channels for visual and auditory signals in V1.
The field must now develop a comprehensive framework for this finding that includes testable theories. We begin in this direction by discussing the relevance of internal representations in facilitatory processing across the senses, suggesting that auditory signals in visual cortex do not assist in the spatial localization of visual inputs, but rather prepare visual cortex for the type of visual input [3].
Summary
Sensory cortices receive domain-specific information through their primary afferent pathways, and information from the other senses via cortical feedback and top-down pathways [1]. These multisensory activities in sensory cortices, for example auditory signatures in primary visual cortex [2], dispel the earlier theory that multisensory processing is restricted to higher cortex. The field must now develop a comprehensive framework for this finding that includes testable theories. We begin in this direction by discussing the relevance of internal representations in facilitatory processing across the senses, suggesting that auditory signals in visual cortex do not assist in the spatial localization of visual inputs, but rather prepare visual cortex for the type of visual input [3]. We propose that multisensory processing in each sensory area should be considered functionally discrete, as each serves different gains for the brain.