Abstract

Our mental representations of our body are continuously updated through multisensory bodily feedback as we move and interact with our environment. Although it is often assumed that these internal models of body representation are used to act successfully upon the environment, only a few studies have actually examined how changes in body representation influence goal-directed actions, and none has examined this in relation to body-representation changes induced by sound. The present work examines this question for the first time. Participants reached for a target object before and after adaptation periods during which the sounds produced by their hand tapping a surface were spatially manipulated to induce a representation of an elongated arm. After adaptation, participants’ reaching movements were performed in a way consistent with having a longer arm, in that their reaching velocities were reduced. These kinematic changes suggest auditory-driven recalibration of the somatosensory representation of arm morphology. These results support the hypothesis that one’s represented body size is used as a perceptual ruler to measure objects’ distances and to guide bodily actions accordingly.


Introduction

In our everyday interaction with the environment, most of us perform many physical actions that allow our body to reach, grab or point at different objects. For most of us these actions seem to occur smoothly and mostly automatically. This successful and smooth interaction with the environment is enabled by the use of internal models of body shape and posture (Head and Holmes, 1911–1912; Maravita and Iriki, 2004). Experiments on the “rubber-hand illusion” (RHI) demonstrated that a rubber arm can be incorporated into one’s body model if one observes touches to the rubber arm while synchronously feeling touch delivered to one’s own, unseen, arm (Botvinick and Cohen, 1998). This illusion results from the integration of congruent sensory information received through vision and touch. Other studies have shown that feelings of ownership over a rubber hand can be elicited during active-touch conditions, in which one delivers touch to the seen fake hand and in synchrony receives touch to one’s own unseen hand (Aimola Davies et al., 2010), and by synchronous seen and felt movement of a hand and one’s own unseen hand (Tsakiris et al., 2006; Newport et al., 2010; Sánchez-Vives et al., 2010; Kalckert and Ehrsson, 2012), highlighting the influence of proprioceptive cues in this illusion.

