Abstract

Generic properties of curvature representations formed on the basis of vision and touch were examined as a function of the mathematical properties of curved objects. Virtual representations of the curves were shown on a computer screen for visual scaling by sighted observers (experiment 1). Their physical counterparts were placed in the two hands of blindfolded and congenitally blind observers for tactile scaling. The psychophysical data show that curvature representations in congenitally blind individuals, who never had any visual experience, and in sighted observers, who rely on vision most of the time, are statistically linked to the same mathematical properties of the curves. The perceived magnitude of object curvature, whether sensed through vision or touch, is related by a power law to the aspect ratio of the curves, a scale-invariant geometric property, with similar exponents for the two sensory modalities. This finding supports biologically motivated models of sensory integration that posit a universal power law for the adaptive brain control and balancing of motor responses to environmental stimuli from any sensory modality.
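
The reported relation has the form of a Stevens-type power law, P = k · r^b, where P is the perceived curvature magnitude, r the aspect ratio of the curve, b the modality-specific exponent, and k a scale factor. As a minimal illustration of how such an exponent could be estimated from magnitude-estimation data, the sketch below fits the power law by linear regression in log-log coordinates; the variable names and numerical values are hypothetical and not taken from the study.

    # Minimal sketch (assumed data): estimate the power-law exponent b in
    # P = k * r**b by linear regression in log-log coordinates.
    import numpy as np

    # Aspect ratios of the curved stimuli (illustrative values only)
    r = np.array([0.05, 0.10, 0.20, 0.40, 0.80])
    # Mean perceived curvature magnitudes for one modality (illustrative)
    P = np.array([1.2, 2.1, 3.9, 7.4, 13.8])

    # log P = log k + b * log r, so a first-degree polynomial fit
    # returns the slope b and the intercept log k.
    b, log_k = np.polyfit(np.log(r), np.log(P), 1)
    k = np.exp(log_k)
    print(f"exponent b = {b:.2f}, scale factor k = {k:.2f}")

    # Fitting visual and tactile data separately and comparing the two
    # exponents is one way to test whether both modalities follow the
    # same power law.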

Highlights

  • Interaction of the human body with technological devices relies on the multisensory integration of visual and tactile signals by the human brain, as in the use of global positioning systems for navigation, or the encoding of visual and tactile spatial information for laparoscopic surgery, for example

  • We looked for generic properties of spatial representations, formed on the basis of vision and touch, related to the mathematical properties of curved objects shown on a computer screen for visual exploration and of their physical counterparts placed in the two hands of blindfolded and congenitally blind observers for tactile exploration

  • The findings from this study show that curvature representations in congenitally blind individuals, who never had any visual experience, and in sighted observers, who take in the physical world through their visual systems most of the time, are statistically linked to the same mathematical properties [23,24,25] of curved objects, whether these are sensed visually on the basis of virtual representations on a computer screen or directly by the two hands on the basis of real-world objects

Introduction

Interaction of the human body with technological devices relies on the multisensory integration of visual and tactile signals by the human brain, as in the use of global positioning systems for navigation or the encoding of visual and tactile spatial information for laparoscopic surgery. Experimental studies have shown that manipulating visual objects with the two hands, and integrating visual and tactile shape information, play an important role in action planning as well as motor control [1, 2]. This matters for spatial perception in congenitally blind people, who never had visual experience yet are perfectly capable of understanding the physical environments that surround them and of forming accurate representations of complex spatial geometry. Physical and perceptual models, tested in the light of statistical probabilities, are needed to extend our knowledge of how visual and tactile brain representations function, how they interact, and what they have in common.
