Abstract

Early models of multisensory integration posited that cross-modal signals converge only in higher-order association cortices and that vision automatically dominates. Recent studies, however, have challenged this view. The present study examined how visuo-tactile interactions depend on the alignment of motion axes across visual and tactile stimuli, on their spatial alignment, and on the visibility of the hand. Using binocular rivalry, opposed visual motions were presented to each eye and participants tracked the perceived direction, while a tactile motion, either a leftward or rightward sweep across the fingerpad, was presented intermittently. Tactile effects on visual percepts depended on the alignment of motion axes: rivalry between upward and downward visual motions was not modulated at all by left/right tactile motion. When both modalities shared a common axis of motion, however, tactile motion signals could alter visual percepts: a tactile stimulus could sustain the dominance of a direction-congruent visual stimulus and shorten its suppression period. These effects were also conditional on the spatial alignment of the visual and tactile stimuli, being eliminated when the tactile device was displaced 15 cm to the right of the visual stimulus. Visibility of the hand touching the tactile device facilitated congruent switches relative to a visual-only baseline but did not confer a significant advantage overall. In sum, these results reveal a low-level sensory interaction that is conditional on the visual and tactile stimuli sharing a common motion axis and location in space.

Highlights

  • In order to perceive our surroundings in a robust and coherent manner, the brain must integrate sensory information within and between modalities (Alais, Newell, & Mamassian, 2010a; Ernst & Bülthoff, 2004).

  • Follow-up pairwise contrasts for each model indicated a significant increase in the probability of the visual percept switching to match the direction of the tactile stimulus only when the visual and tactile stimuli shared a common axis of motion and appeared spatially aligned (Parallel motion axes condition for the effect of direction selectivity [Fig. 4a; Left]: z = -3.67, p < .001; Spatially aligned condition for the effect of spatial alignment [Fig. 4b; Left]: z = -2.87, p = .004). These results suggest that the tactile stimulus curtailed suppression periods for direction-congruent visual stimuli when the motion axes and spatial locations of the two modalities coincided, regardless of hand visibility (see the analysis sketch after this list).

  • The results suggest that congruent tactile stimulation mainly stabilised rivalry dynamics by promoting the dominance of an already congruent visual percept.
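
For illustration only, the sketch below shows one way a pairwise contrast on switch probability could be computed. It is a minimal Python example on simulated data using a plain logistic regression (statsmodels) rather than the models fitted in the study; the condition labels, simulated effect sizes, and variable names are assumptions introduced here, not values taken from the paper.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(0)

    # Simulated trial-level data (not the study's data): did the rivalrous visual
    # percept switch to match the tactile direction (1 = yes, 0 = no) under two
    # hypothetical motion-axis conditions?
    n = 400
    axis = rng.choice(["parallel", "orthogonal"], size=n)
    p_switch = np.where(axis == "parallel", 0.55, 0.35)  # assumed effect, for illustration only
    trials = pd.DataFrame({"axis": axis, "switch": rng.binomial(1, p_switch)})

    # Logistic regression of switch probability on motion-axis alignment; the Wald
    # z-statistic on the 'parallel' coefficient serves as the pairwise contrast
    # between the two conditions.
    model = smf.logit("switch ~ C(axis, Treatment('orthogonal'))", data=trials).fit(disp=False)
    print(model.summary())

The study's own contrasts would follow from its full model specification; this sketch only illustrates the underlying logic of comparing switch probabilities between conditions.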


Introduction

In order to perceive our surroundings in a robust and coherent manner, the brain must integrate sensory information within and between modalities (Alais, Newell, & Mamassian, 2010a; Ernst & Bülthoff, 2004). Where along the processing pathways sensory signals are combined remains a matter of debate. Early models held that multisensory integration occurred beyond the primary sensory cortices, in multisensory association areas, but recent findings have challenged this view. Analogous effects across vision and touch have previously been reported for dynamic signals, highlighting similarities in visuo-tactile motion processing (Carter et al., 2008; Gori et al., 2011). Both vision and touch share comparable organisational principles when extracting and encoding motion signals over space and time.

