Abstract

In the study of bodily awareness, predictive coding theory holds that the brain continuously modulates sensory experiences to integrate them into a unitary body representation. Indeed, during multisensory illusions (e.g., the rubber hand illusion, RHI), the synchronous stroking of the participant's concealed hand and a visible fake one creates a visuotactile conflict, generating a prediction error. Within the predictive coding framework, prediction errors are resolved through the modulation of sensory processing, inducing participants to feel as if touches originated from the fake hand and thus to ascribe the fake hand to their own body. Here, we aimed to characterize this modulation of sensory processing under multisensory conflict by disentangling the processing of somatosensory and visual stimuli, which are intrinsically coupled during illusion induction. To this aim, we designed two EEG experiments in which somatosensory-evoked potentials (SEPs; Experiment 1; N = 18; F = 10) and visual-evoked potentials (VEPs; Experiment 2; N = 18; F = 9) were recorded in human males and females following the RHI. Our results show that, in both experiments, ERP amplitude was significantly modulated in the illusion condition compared with both control and baseline conditions, with a modality-dependent, diametrical pattern: decreased SEP amplitude and increased VEP amplitude. Importantly, both the somatosensory and the visual modulations occurred in long-latency time windows previously associated with tactile and visual awareness, thus accounting for the illusory perception of touch at the location of the seen stroking. In conclusion, we describe a diametrical modulation of somatosensory and visual processing as the neural mechanism that maintains a stable body representation by restoring visuotactile congruency when multisensory conflicts occur.
