Abstract

Vision and proprioception, which inform the system about the body's position in space, seem crucial in defining the boundary of the peripersonal space (PPS). What happens to the PPS representation when a conflict between vision and proprioception arises? We capitalize on immersive virtual reality to dissociate vision and proprioception by presenting a 3D image of the participants' hand in positions congruent or incongruent with the participants' real hand. To measure the hand-centred PPS, we exploit the multisensory integration that occurs when visual stimuli are delivered simultaneously with tactile stimuli applied to a body part, i.e., the visual enhancement of touch (VET). Participants are instructed to respond to tactile (electrical) stimuli while ignoring visual stimuli (a red LED), which can appear either near to or far from the hand receiving the tactile stimuli. The results show that, when vision and proprioception are congruent (i.e., real and virtual hand coincide), a space-dependent modulation of the VET effect occurs, with faster responses when visual stimuli are near to than far from the stimulated hand. Conversely, when vision and proprioception are incongruent (i.e., a discrepancy between real and virtual hand is present), a comparable VET effect is observed both when visual stimuli occur near to the real hand and when they occur far from it but close to the virtual hand. These findings, also confirmed by the independent estimate of a Bayesian Causal Inference model, suggest that, when the visuo-proprioceptive discrepancy makes the coding of hand position less precise, the hand-centred PPS is enlarged, likely to optimize reactions to external events.
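The Bayesian Causal Inference idea invoked above can be illustrated with a minimal sketch: an observer receives a visual and a proprioceptive estimate of hand position and computes the posterior probability that the two signals share a common cause (the standard formulation of Körding et al.'s causal-inference model). The variances and the common-cause prior below are illustrative assumptions, not the paper's fitted parameters:

```python
import math

def gauss(x, mu, var):
    """Gaussian density N(x; mu, var)."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def p_common(x_v, x_p, var_v=1.0, var_p=4.0, var_s=100.0, prior_c=0.5):
    """Posterior probability that the visual (x_v) and proprioceptive (x_p)
    hand-position signals share a common cause. Sensory noise variances
    (var_v, var_p), the prior variance over source locations (var_s), and
    the common-cause prior (prior_c) are illustrative values."""
    # Likelihood under a common cause: the shared source s ~ N(0, var_s)
    # is integrated out analytically.
    denom = var_v * var_p + var_v * var_s + var_p * var_s
    like_c1 = math.exp(-0.5 * ((x_v - x_p) ** 2 * var_s
                               + x_v ** 2 * var_p
                               + x_p ** 2 * var_v) / denom) / (2 * math.pi * math.sqrt(denom))
    # Likelihood under independent causes: each signal has its own source.
    like_c2 = gauss(x_v, 0.0, var_v + var_s) * gauss(x_p, 0.0, var_p + var_s)
    return prior_c * like_c1 / (prior_c * like_c1 + (1 - prior_c) * like_c2)
```

When the virtual and real hand coincide (`p_common(0, 0)`), the posterior favours a common cause and the signals are integrated; with a large visuo-proprioceptive discrepancy (`p_common(0, 30)`), the posterior favours separate causes, and the resulting uncertainty about hand position is consistent with the enlarged PPS reported in the abstract.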
