Abstract

Spatial information processing takes place in different brain regions that receive converging inputs from several sensory modalities. Because of our own movements—for example, changes in eye position, head rotations, and so forth—unimodal sensory representations move continuously relative to one another. It is generally assumed that for multisensory integration to be an orderly process, it should take place between stimuli at congruent spatial locations. In the monkey posterior parietal cortex, the ventral intraparietal (VIP) area is specialized for the analysis of movement information using visual, somatosensory, vestibular, and auditory signals. Focusing on the visual and tactile modalities, we found that in area VIP, as in the superior colliculus, multisensory signals interact at the single-neuron level, suggesting that this area participates in multisensory integration. Curiously, VIP does not use a single, invariant coordinate system to encode locations within and across sensory modalities. Visual stimuli can be encoded with respect to the eye, the head, or halfway between the two reference frames, whereas tactile stimuli seem to be predominantly encoded relative to the body. Hence, while some multisensory neurons in VIP could encode spatially congruent tactile and visual stimuli independently of current posture, in other neurons this would not be the case. Future work will need to evaluate the implications of these observations for theories of optimal multisensory integration.
