Abstract

The Journal of Physiology symposium The Senses was held in San Diego on 22 October 2004 as a satellite symposium to the annual meeting of the Society for Neuroscience. It brought together leading experts to highlight the similarities and differences in processing strategies across a variety of sensory systems, including vision, audition, somatosensation and olfaction. Over the course of the last hundred years, sensory physiology has been a remarkable success story. In particular, the very earliest stage of sensory processing – transduction – is now well understood for most modalities. This success is reflected in a number of Nobel Prizes awarded over the past few decades. Georg von Békésy was awarded the Nobel Prize in 1961 ‘for his discoveries of the physical mechanism of stimulation within the cochlea’. In 1967, Ragnar Granit, George Wald and H. Keffer Hartline shared the Nobel Prize ‘for their discoveries concerning the primary physiological and chemical visual processes in the eye’. Soon after, David Hubel and Torsten Wiesel went one step further: ‘for their discoveries concerning information processing in the visual system’, they were awarded the Nobel Prize in 1981. Their work addressed the question of what kinds of features are extracted from the stream of information delivered by elementary sensors. Most recently, the genetic basis for olfactory sensors was discovered by Linda Buck and Richard Axel, who were awarded the 2004 Nobel Prize ‘for their discoveries concerning odorant receptors and the organization of the olfactory system’. Despite this enormous progress, sensory physiologists are still far from solving some of the major mysteries of the brain, namely how we use information from all modalities to recognize objects and how this information is used to guide our actions. These questions are addressed in this symposium-related issue of The Journal of Physiology.
In the paper by Fuchs (2005), possible mechanisms are explored that could explain the remarkable ability of the hair cell's ribbon synapse to encode independently the precise timing and the intensity of a sound stimulus. Callaway (2005) points out in his contribution how luminance and chromatic information are conveyed to the visual cortex in separate pathways, implying very specific connections between cone photoreceptors and thalamic neurones. Arguably even more important than merely recognizing things, sensory information is used to enable orientation in space and to form decisions about future actions. McAlpine (2005) presents new insights into how the mammalian brain uses minute differences in the timing of sounds arriving at the two ears (resolved thanks to the precision of the two ribbon synapses; Fuchs, 2005) to help create a sense of auditory space. Continuing the theme of spatial perception, Bremmer (2005) shows how information from the visual, auditory and somatosensory systems is integrated in the parietal cortex of monkeys and humans. Finally, Nicolelis (2005) characterizes how somatosensory computations are performed in ensembles of neurones in rats, and how these responses depend on the state of the animal. The contributions to this issue make it clear that similar coding strategies for converting elementary sensory information into features are used across quite different sensory modalities. However, it should also be clear that we still have a long way to go towards understanding the crucial next steps from features to objects and actions.