Abstract

Some visual-tactile (bimodal) cells have visual receptive fields (vRFs) that overlap the skin of the hand and extend moderately beyond it. Neurophysiological evidence suggests, however, that a vRF will grow to encompass a hand-held tool following active tool use, but not after passive holding. Why does active tool use, and not passive holding, lead to spatial adaptation near a tool? We asked whether this spatial adaptation results from motor or visual experience with the tool, and we distinguished between these alternatives by isolating motor from visual experience. Participants learned to use a novel, weighted tool. The active-training group received both motor and visual experience with the tool; the passive-training group received visual experience with the tool but no motor experience; and a no-training control group received neither visual nor motor experience with the tool. After training, we used a cueing paradigm to measure how quickly participants detected targets, varying whether the tool was placed near to or far from the target display. Only the active-training group detected targets more quickly when the tool was placed near, rather than far from, the target display. This effect of tool location was absent for both the passive-training and control groups. These results suggest that motor learning influences how visual space around the tool is represented.

Highlights

  • Multisensory neurons integrate visual, tactile, and proprioceptive information, letting us track where objects are located relative to our limbs even when we are not looking at them directly

  • Neurophysiological evidence suggests that the visual receptive fields of visual-tactile bimodal neurons can grow to encompass a hand-held tool, but that this spatial adaptation follows active tool use, not passive holding [1]

  • Tool-related benefits may depend on motor learning: we found near-tool benefits only when visual experience with the tool was paired with motor experience

Introduction

One role of multisensory neurons is to integrate visual, tactile, and proprioceptive information so that we can track where objects are located relative to our limbs, even when we are not looking at them directly. Some bimodal visual-tactile neurons, discovered in the monkey ventral premotor cortex (PMv) and the intraparietal sulcus, have overlapping visual and tactile receptive fields (vRFs and tRFs, respectively), typically on the face or hand [4,5,6,7,8,9]. Some of these neurons also receive proprioceptive input [6,7,8,9], and their vRFs are anchored to the hand, moving with it as the hand moves [6,7]. Visual information presented near the hands, i.e., in peripersonal or pericutaneous space [10], may therefore recruit bimodal neurons, whereas visual information presented away from the hands may not.
