Abstract

Human-environment interactions normally occur in the physical milieu, and thus through the medium of the body and within the space immediately adjacent to and surrounding the body: the peri-personal space (PPS). However, human interactions increasingly occur with or within virtual environments, and hence novel approaches and metrics must be developed to index human-environment interactions in virtual reality (VR). Here we present a multisensory task that measures the spatial extent of human PPS in real, virtual, and augmented realities. We validated it in a mixed reality ecosystem in which the real environment and virtual objects are blended together in order to administer and control visual, auditory, and tactile stimuli under ecologically valid conditions. Within this mixed-reality environment, participants are asked to respond as fast as possible to tactile stimuli on their body while task-irrelevant visual or audio-visual stimuli approach their body. Results demonstrate that, in analogy with observations derived from monkey electrophysiology and from real environmental surroundings, tactile detection is enhanced when visual or auditory stimuli are close to the body, but not when they are far from it. We then calculate the location where this multisensory facilitation occurs as a proxy for the boundary of PPS. We observe that mapping PPS via audio-visual, as opposed to visual alone, looming stimuli results in sigmoidal fits – allowing for the bifurcation between near and far space – with greater goodness of fit. In sum, our approach is able to capture the boundaries of PPS on a spatial continuum, at the individual-subject level, and within a fully controlled and previously laboratory-validated setup, while maintaining the richness and ecological validity of real-life events. The task can therefore be applied to study the properties of peri-personal space in humans and to index the features governing human-environment interactions in virtual or mixed reality.
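The boundary estimate described above – the point along the looming trajectory where multisensory facilitation of tactile reaction times emerges – can be sketched as a sigmoid fit to per-distance mean RTs. The following is a minimal illustration, not the authors' analysis pipeline; the distances, RT values, and parameter names are hypothetical placeholders, and the sigmoid midpoint is taken as a proxy for the PPS boundary.

```python
# Illustrative sketch (hypothetical data): fit a sigmoid to tactile RTs as a
# function of looming-stimulus distance; the midpoint approximates the PPS
# boundary where multisensory facilitation sets in.
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(d, rt_near, rt_far, d_c, slope):
    # RT rises from rt_near (facilitated, stimulus close to body)
    # to rt_far (no facilitation, stimulus far from body).
    return rt_near + (rt_far - rt_near) / (1.0 + np.exp(-slope * (d - d_c)))

# Six stimulus distances (cm from the body) and mean tactile RTs (ms);
# both arrays are made-up example values, not data from the study.
distances = np.array([15.0, 30.0, 45.0, 60.0, 75.0, 90.0])
mean_rts = np.array([310.0, 315.0, 330.0, 355.0, 360.0, 362.0])

p0 = [mean_rts.min(), mean_rts.max(), distances.mean(), 0.1]
params, _ = curve_fit(sigmoid, distances, mean_rts, p0=p0, maxfev=10000)
rt_near, rt_far, d_c, slope = params
print(f"Estimated PPS boundary (sigmoid midpoint): {d_c:.1f} cm")
```

Goodness of fit of such a sigmoid (e.g., R² of the fitted curve) is what the abstract compares between visual-only and audio-visual looming conditions.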
We propose PPS as an ecologically valid and neurophysiologically established metric in the study of the impact of VR and related technologies on society and individuals.

Highlights

  • Reaction times (RTs) in the visuotactile condition were faster than the fastest unimodal RT when tactile stimulation was associated with a virtual ball at D1, D2, and D3, but not when the ball was at farther distances (D4, D5, and D6)

  • We present how the boundaries of peripersonal space (PPS) can be measured in terms of spatially dependent modulation of multisensory responses with a simple behavioral task that can be conducted with participants immersed in a mixed reality (MR) environment

Introduction

The manner in which the brain integrates information from different senses in order to boost perception and guide actions is a major research topic in cognitive neuroscience (Calvert et al, 2004; Spence and Driver, 2004; Stein, 2012) and a topic of increasing interest in the design of virtual environments. The manipulation of bodily inputs has been used to induce the feeling that an artificial or virtual body is one’s own and to generate the sensation of being located within a virtual environment (Tsakiris, 2010; Blanke, 2012; Ehrsson, 2012; Serino et al, 2013; Noel et al, 2015b; Salomon et al, 2017). These findings highlight the central role of bodily inputs in virtual reality (VR) (Herbelin et al, 2016). We propose and demonstrate that it is possible to delineate and measure a representation of PPS within virtual and mixed reality (MR) environments.

