During active movement, there is normally a tight relation between the motor command and the sensory representation of the resulting spatial displacement of the body. Indeed, some theories of space perception emphasize the topographic layout of sensory receptor surfaces, while others emphasize the implicit spatial information provided by the intensity of motor command signals. To identify which plays the primary role in spatial perception, we developed experiments based on everyday self-touch, in which the right hand strokes the left arm. We used a robot-mediated form of self-touch to decouple the spatial extent of active or passive right-hand movements from their tactile consequences. Participants made active movements of the right hand between unpredictable, haptically defined start and stop positions, or the hand was passively moved between the same positions. These movements caused a brush to stroke along the left forearm, with minimal delay but with an unpredictable spatial gain factor. Participants judged the spatial extent of either the right hand's movement or the resulting tactile stimulation of their left forearm. Across five experiments, we found that movement extent strongly interfered with the perception of tactile extent, and vice versa. Crucially, interference in both directions was stronger during active than during passive movements. Thus, voluntary motor commands produced stronger integration of the multiple sensorimotor signals underpinning the perception of personal space. Our results prompt a reappraisal of classical theories that reduce space perception to motor command information.