Abstract

Earlier studies have revealed cross-modal visuo-tactile interactions in endogenous spatial attention. The current research used event-related potentials (ERPs) and virtual reality (VR) to identify how visual cues of the perceiver’s body affect visuo-tactile interaction in endogenous spatial attention, and at what point in time the effect takes place. A bimodal oddball task with lateralized tactile and visual stimuli was presented in two VR conditions, one with and one without visible hands, and one VR-free control condition with the hands in view. Participants were required to silently count one type of stimulus and ignore all other stimuli presented in an irrelevant modality or location. The presence of hands was found to modulate early and late components of somatosensory and visual evoked potentials. For sensory-perceptual stages, the presence of virtual or real hands amplified attention-related negativity in the somatosensory N140 and cross-modal interaction in the somatosensory and visual P200. For postperceptual stages, an amplified N200 component was obtained in somatosensory and visual evoked potentials, indicating increased response inhibition to non-target stimuli. The somatosensory, but not the visual, N200 effect was enhanced when the virtual hands were present. The findings suggest that bodily presence affects sustained cross-modal spatial attention between vision and touch, and that this effect is specifically present in ERPs related to early and late sensory processing, as well as response inhibition, but does not affect later attention- and memory-related P3 activity. Finally, the experiments provide commensurable scenarios for estimating the signal-to-noise ratio to quantify effects related to the use of a head-mounted display (HMD). However, despite valid a priori reasons for fearing signal interference due to an HMD, we observed no significant drop in the robustness of our ERP measurements.

Highlights

  • Our ability to focus on a specific location while ignoring events occurring in other directions is a vital requirement for successful interaction with the surrounding world

  • We calculated noise as the effect size of the relevant modality within the baseline interval, and signal as the same comparison within a post-stimulus window of the same length as the noise interval

  • Artefactual data were removed from the analysis by visual inspection; however, because artifacts tend to produce extreme voltages, it is more common to apply a threshold to the absolute amplitude or the largest difference value within epochs


Introduction

Our ability to focus on a specific location while ignoring events occurring in other directions is a vital requirement for successful interaction with the surrounding world. Even when participants are instructed to completely ignore stimuli in the irrelevant modality and respond only to, for example, left-sided vibrations, visual ERPs show enhanced processing for visual stimuli appearing at the relevant location (left). Such cross-modal interactions have been observed across various modality pairings: between vision and touch, between vision and audition, and between audition and touch (Teder-Sälejärvi et al., 1999; Eimer et al., 2002).

