Abstract

A subset of sensory substitution (SS) devices translates images into sounds in real time using a portable computer, camera, and headphones. Perceptual constancy is key to understanding both the functional and phenomenological aspects of perception with SS. In particular, constancies enable object externalization, which is critical to the performance of daily tasks such as obstacle avoidance and locating dropped objects. In order to improve daily task performance by the blind, and to determine whether constancies can be learned with SS, we trained blind (N = 4) and sighted (N = 10) individuals on length and orientation constancy tasks for 8 days at about 1 h per day with an auditory SS device. We found that both blind and sighted participants significantly improved at the constancy tasks, attaining above-chance performance. Furthermore, dynamic interaction with the stimuli was critical to constancy learning with the SS device. In particular, improved task learning significantly correlated with the number of spontaneous left-right head-tilting movements made while learning length constancy. The improvement from previous head-tilting trials even transferred to a no-head-tilt condition. Therefore, not only can SS learning be improved by encouraging head movement during training, but head movement may also play an important role in learning constancies in the sighted. In addition, the learning of constancies by the blind and sighted with SS provides evidence that SS may be able to restore vision-like functionality to the blind in daily tasks.

Highlights

  • One class of sensory substitution (SS) devices encodes an image into tactile or auditory stimuli to allow vision-like perception after the onset of blindness

  • This study aimed to train sighted and blind participants on two 2D constancies and to identify the training elements that aided that learning

  • We found that the sighted and blind were able to identify line angles independent of head tilt and identify line length independent of line angle (2D orientation and length constancy, respectively)


Introduction

One class of sensory substitution (SS) devices encodes an image into tactile or auditory stimuli to allow vision-like perception after the onset of blindness. Several studies have shown that both blind and sighted users of SS devices, such as the vOICe, can obtain the functional aspects of vision via crossmodal plasticity (Bach-y-Rita et al., 1998; Renier et al., 2005; Amedi et al., 2007; Poirier et al., 2007a; Proulx et al., 2008; Plaza et al., 2009; Striem-Amit et al., 2012). Siegle and Warren showed that distal attribution, or the perception of an object as located in external space, can be achieved with a tactile SS device (Siegle and Warren, 2010). Ward and Wright discussed the sensorimotor similarities and differences between SS perception and traditional visual perception, as well as the concept of “embodiment,” or incorporation of the SS camera into bodily perception (Ward and Wright, 2014).
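The excerpt above does not spell out how an auditory SS device such as the vOICe maps an image to sound. As a rough illustration only (the function name, parameter values, and frequency range below are assumptions for the sketch, not taken from this study), a typical vOICe-style encoding scans a grayscale image column by column from left to right over about a second, with pixel row mapped to pitch and pixel brightness mapped to loudness:

```python
# Minimal sketch of a vOICe-style image-to-sound encoding.
# All parameters here are illustrative assumptions, not the study's settings.
import numpy as np

def encode_image_to_sound(image, sample_rate=44100, scan_duration=1.0,
                          f_min=500.0, f_max=5000.0):
    """Map a 2D grayscale image (values in [0, 1], row 0 = top) to a mono
    waveform: higher rows -> higher pitch, brighter pixels -> louder."""
    n_rows, n_cols = image.shape
    samples_per_col = int(sample_rate * scan_duration / n_cols)
    # One sinusoid per image row; the top row gets the highest frequency.
    freqs = np.linspace(f_max, f_min, n_rows)
    t = np.arange(samples_per_col) / sample_rate
    waveform = []
    for col in range(n_cols):
        # Sum of row sinusoids, each weighted by that pixel's brightness.
        amplitudes = image[:, col][:, None]              # (n_rows, 1)
        tones = np.sin(2 * np.pi * freqs[:, None] * t)   # (n_rows, samples)
        waveform.append((amplitudes * tones).sum(axis=0))
    waveform = np.concatenate(waveform)
    peak = np.max(np.abs(waveform))
    return waveform / peak if peak > 0 else waveform

# Example: a bright diagonal line from bottom-left to top-right produces a
# rising pitch sweep as the scan moves across the image.
img = np.zeros((64, 64))
img[np.arange(63, -1, -1), np.arange(64)] = 1.0
audio = encode_image_to_sound(img)
```

In such an encoding, a tilted line produces a pitch sweep whose direction and rate depend on its orientation and length, which is the kind of auditory cue the orientation and length constancy tasks ask listeners to interpret across changes in head tilt and line angle.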
