Abstract

This study examines how artificial tactile stimulation from a novel noninvasive sensory device is learned and integrated with information from the visual sensory system. In our experiment, visual direction information was paired with reliable symbolic tactile information. Over several training blocks, discrimination performance in unimodal tactile test trials and participants’ confidence in their decisions improved, indicating that participants could associate the visual and tactile information consciously and thus learned the meaning of the symbolic tactile cues. Our results showed that information from both modalities is integrated during the early learning phase. Although this integration is consistent with a Bayesian integration model, under certain conditions it can lead to nonoptimal perception, such that participants’ performance is worse than if they had relied on a single cue alone. Furthermore, we showed that a confidence-based Bayesian integration explains the observed behavioral data better than the classical variance-based Bayesian integration. The present study demonstrates that humans can consciously learn and integrate information from an artificial sensory device providing symbolic tactile information. We also shed light on how Bayesian integration can lead to nonoptimal perception by providing additional models and simulations. Our findings connect the field of multisensory integration to the development of sensory substitution systems.
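As a minimal sketch of the variance-based Bayesian integration the abstract contrasts against, the simulation below fuses two noisy cues with reliability-proportional weights. All quantities (the scalar direction variable, the noise levels, and the misjudged weight) are illustrative assumptions, not values from the study; the sketch only shows why optimally weighted fusion reduces variance, while misweighted fusion can underperform the better single cue.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: a visual and a tactile cue each give a noisy
# estimate of a true direction (a scalar, in arbitrary units).
true_direction = 1.0
sigma_v, sigma_t = 0.5, 1.0  # assumed noise levels (visual more reliable)

n = 100_000
visual = rng.normal(true_direction, sigma_v, n)
tactile = rng.normal(true_direction, sigma_t, n)

# Variance-based Bayesian integration: each cue is weighted in
# inverse proportion to its variance.
w_v = (1 / sigma_v**2) / (1 / sigma_v**2 + 1 / sigma_t**2)
combined = w_v * visual + (1 - w_v) * tactile

# The optimally fused estimate is less variable than either cue alone.
print(visual.var(), tactile.var(), combined.var())

# But if the weights reflect mis-estimated reliabilities (here, a
# hypothetical underweighting of the more reliable visual cue), the
# fused estimate is worse than just using vision.
w_bad = 0.2
misweighted = w_bad * visual + (1 - w_bad) * tactile
print(misweighted.var() > visual.var())
```

The first print shows the fused variance falling below both single-cue variances; the second shows how integration with inappropriate weights can yield nonoptimal perception.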
