Abstract

Many distinct spaces surround our bodies. Most schematically, the key division is between peripersonal space (PPS), the space immediately surrounding the body, and extrapersonal space, the space beyond one's reach. PPS is considered an action space that allows us to interact with our environment by touching and grasping. In the current scientific literature, visual representations of PPS typically appear as mere bubbles of uniform dimensions wrapped around the body. Although more recent investigations of the PPS of the upper body (trunk, head, and hands) and lower body (legs and feet) have provided new representations, no study has yet estimated the overall representation of PPS in 3D. Previous findings have demonstrated how the relationship between tactile processing and the location of sound sources in space is modified along a spatial continuum. These findings suggest that similar methods can be used to localize the boundaries of an individual's subjective representation of PPS. Hence, we designed a behavioral paradigm in virtual reality based on audio-tactile interactions, which enabled us to infer a detailed individual 3D audio-tactile representation of PPS. Considering that inadequate body-related multisensory integration processes can produce incoherent spatio-temporal perception, a virtual reality setup and a method for estimating the subjective volumetric boundaries of PPS will be a valuable addition for understanding the mismatches that occur between physical body boundaries and body schema representations in 3D.
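The abstract does not spell out the analysis, but a common way to operationalize the audio-tactile mapping it describes is to fit a sigmoid to tactile reaction times (RTs) as a function of sound-source distance and take the inflection point as the PPS boundary along that direction. The sketch below illustrates this idea only; the data, distances, and starting values are hypothetical, not taken from the study:

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid(d, ymin, ymax, d_c, slope):
    """Tactile RT as a function of sound distance d.
    RTs are fast (ymin) when the sound is near the body and rise
    toward ymax as it recedes; the inflection point d_c is taken
    as the PPS boundary along the probed direction."""
    return ymin + (ymax - ymin) / (1.0 + np.exp(-slope * (d - d_c)))

# Hypothetical data: mean tactile RTs (ms) measured while a sound
# was presented at six distances (cm) from the body.
distances = np.array([10, 30, 50, 70, 90, 110], dtype=float)
rts = np.array([355, 360, 378, 402, 410, 412], dtype=float)

# Initial guess: RT floor/ceiling from the data, boundary mid-range.
p0 = [rts.min(), rts.max(), distances.mean(), 0.1]
params, _ = curve_fit(sigmoid, distances, rts, p0=p0, maxfev=10000)

pps_boundary_cm = params[2]
print(f"Estimated PPS boundary along this direction: {pps_boundary_cm:.1f} cm")
```

Repeating such a fit for each probed direction yields one radial threshold per direction, which is the ingredient needed for the 3D reconstruction described in the highlights.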

Highlights

  • In the last two decades, we have witnessed a rising interest in neuroscience regarding cross-modal and multisensory body representations (Maravita et al., 2003; Holmes et al., 2004) and their influences on the mental division of external spaces and 3D spatial interactions (Grüsser, 1983; Previc, 1990; Previc, 1998; Cutting and Vishton, 1995; Maravita et al., 2004; De Vignemont and Iannetti, 2015; Postma et al., 2016)

  • The spatial representation of peripersonal space (PPS) has been explored over the centuries in the visual arts, the performing arts, and the sciences

  • By connecting the reaction time (RT) thresholds in each of the twelve directions, we draw a spatial polyhedron that serves as an approximation of PPS and its boundaries in 3D
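The polyhedron construction in the last highlight can be sketched numerically: place one vertex per probed direction at the estimated threshold distance, then take the convex hull of the twelve vertices as the 3D approximation of PPS. The direction layout and threshold values below are hypothetical placeholders, not the study's actual data:

```python
import numpy as np
from scipy.spatial import ConvexHull

# Hypothetical layout: twelve probe directions as (azimuth, elevation)
# pairs in degrees around the participant's trunk.
directions_deg = [(az, el) for el in (-30, 0, 30) for az in (0, 90, 180, 270)]

# Hypothetical per-direction PPS thresholds (cm), one radial
# distance per probed direction, as estimated from the RT analysis.
thresholds_cm = np.array([55, 70, 62, 48, 60, 75, 66, 50, 52, 68, 58, 45],
                         dtype=float)

def to_cartesian(az_deg, el_deg, r):
    """Convert a spherical probe direction and radius to x, y, z (cm)."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    return r * np.array([np.cos(el) * np.cos(az),
                         np.cos(el) * np.sin(az),
                         np.sin(el)])

vertices = np.array([to_cartesian(az, el, r)
                     for (az, el), r in zip(directions_deg, thresholds_cm)])

# Convex hull of the twelve vertices: a polyhedron approximating
# the subjective PPS boundary, with a well-defined enclosed volume.
hull = ConvexHull(vertices)
print(f"Approximate PPS volume: {hull.volume / 1000:.1f} litres")
```

The convex hull is one simple choice of surface through the twelve points; denser direction sampling would allow smoother, possibly non-convex reconstructions.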



Introduction

In the last two decades, we have witnessed a rising interest in neuroscience regarding cross-modal and multisensory body representations (Maravita et al., 2003; Holmes et al., 2004) and their influences on the mental division of external spaces and 3D spatial interactions (Grüsser, 1983; Previc, 1990; Previc, 1998; Cutting and Vishton, 1995; Maravita et al., 2004; De Vignemont and Iannetti, 2015; Postma et al., 2016). We chose to focus on the body schema's egocentric representation insofar as it represents how the body dictates the movements it performs (De Vignemont, 2010; De Vignemont, 2018) and is an unconscious experience of spatiality that relies on the multisensory integration mechanisms closely involved in the dynamic spatial encoding of PPS (Spence et al., 2008; Brozzoli et al., 2012), which interests us here. PPS, the close space surrounding the body (Brain, 1941; Previc, 1988; Rizzolatti et al., 1997; Noel et al., 2015a; Di Pellegrino and Làdavas, 2015; Graziano, 2017; Hunley et al., 2018), can be traced back in visual representation all the way to Leonardo da Vinci's drawing of the "Vitruvian Man" (1490), which depicted the anatomical configuration and proportions of the human body. At the turn of the twentieth century, the choreographer Rudolph Laban, in his theory of choreutics (Von Laban, 1966), linked his studies of movement with Pythagorean mathematics and formulated the concept of a kinesphere to characterize the space surrounding one's body "within reaching possibilities of the limbs without changing one's place" (Dell et al., 1977).

