Abstract

Peripersonal space processing in the monkey brain relies on visuo-tactile neurons activated by objects near, but not touching, the animal's skin. Multisensory interplay in peripersonal space is now well documented in humans as well, both in brain-damaged patients presenting with cross-modal extinction and in healthy subjects, and typically takes the form of stronger visuo-tactile interactions in peripersonal than in far space. We recently showed in healthy humans the existence of a functional link between voluntary object-oriented actions (Grasping) and the multisensory coding of the space around us (as indexed by visual–tactile interaction). Here, we investigated whether performing different actions towards the same object entails differential modulations of peripersonal space. Healthy subjects were asked to either grasp or point towards a target object. In addition, they discriminated whether tactile stimuli were delivered on their right index finger (up) or thumb (down), while ignoring visual distractors. Visuo-tactile interaction was probed in baseline Static conditions (before the movement) and in dynamic conditions (action onset and execution). Results showed that, compared to the Static baseline, both actions similarly strengthened visuo-tactile interaction at action onset, when Grasping and Pointing were kinematically indistinguishable. Crucially, Grasping induced further enhancement than Pointing in the execution phase, i.e., when the two actions kinematically diverged. These findings reveal that performing actions induces a continuous remapping of the multisensory peripersonal space as a function of on-line sensory–motor requirements, thus supporting the hypothesis of a role for peripersonal space in the motor control of voluntary actions.
