Abstract
Understanding how visual and proprioceptive information interact during tool use is important because these interactions underlie learning of the tool's kinematic transformation and thus skilled performance. This study investigated how the CNS combines seen cursor positions and felt hand positions under a visuo-motor rotation paradigm. Young and older adult participants performed aiming movements on a digitizer while viewing rotated visual feedback on a monitor. After each movement, they judged either the proprioceptively sensed hand direction or the visually sensed cursor direction. We identified asymmetric mutual biases with a strong visual dominance. Furthermore, we found a number of differences between explicit and implicit judgments of hand direction. The explicit judgments had considerably larger variability than the implicit judgments, and their bias toward the cursor direction was about twice as strong. The individual biases of explicit and implicit judgments were uncorrelated, and the biases of the two judgment types exhibited opposite sequential effects. Moreover, age-related changes differed between the judgment types: with increasing age, judgment variability decreased and the bias toward the cursor direction increased only for the explicit judgments. These results indicate distinct explicit and implicit neural representations of hand direction, similar to the notion of distinct visual systems.
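As a rough sketch of the paradigm described above (the 30-degree rotation, the sign convention, and the bias index used here are illustrative assumptions, not the study's actual parameters or analysis):

def cursor_direction(hand_dir_deg, rotation_deg=30.0):
    # Rotated visual feedback: the cursor moves in a direction offset from
    # the actual hand movement direction by a fixed visuo-motor rotation.
    return hand_dir_deg + rotation_deg

def bias_toward_cursor(judged_deg, hand_deg, cursor_deg):
    # Proportional shift of a hand-direction judgment away from the true hand
    # direction toward the seen cursor direction (0 = no bias, 1 = full capture).
    return (judged_deg - hand_deg) / (cursor_deg - hand_deg)

# Example: the hand moves at 0 deg while the cursor feedback is shown at 30 deg;
# a reported hand direction of 12 deg corresponds to a bias of 0.4 toward the cursor.
hand = 0.0
cursor = cursor_direction(hand)                # 30.0
print(bias_toward_cursor(12.0, hand, cursor))  # 0.4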
Highlights
Without a tool, the position of the hand is monitored both visually and proprioceptively, and both modalities are integrated to obtain a single estimate of hand position [1].
In tool-use actions, such as controlling a cursor on a monitor through a computer mouse, visual information specifies the position of the effective part of the tool, while proprioceptive information specifies the position of the hand.
Visual dominance is likely related to the higher precision of visual than of proprioceptive spatial information.
Summary
The position of the hand is monitored both visually and proprioceptively, and both modalities are integrated to obtain a single estimate of hand position [1]. Recent evidence shows that the weights assigned to the different sources of information in this averaging match their relative reliabilities, that is, their inverse variances [2,3]. In tool-use actions, such as controlling a cursor on a monitor through a computer mouse, visual information specifies the position of the effective part of the tool (i.e., the cursor), while proprioceptive information specifies the position of the hand. These positions belong to different objects and are clearly separated in space. Nevertheless, integrating the two signals can reduce the variances of the resulting (biased) estimates and thus enhance precision even when visual and proprioceptive information refer to different objects.
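For illustration, the inverse-variance weighting referred to above can be written as the standard minimum-variance cue-combination scheme (generic notation, not the paper's own symbols):

\hat{x} = w_V \hat{x}_V + w_P \hat{x}_P, \qquad
w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_P^2} = \frac{\sigma_P^2}{\sigma_V^2 + \sigma_P^2}, \qquad
w_P = 1 - w_V, \qquad
\sigma_{\hat{x}}^2 = \frac{\sigma_V^2\,\sigma_P^2}{\sigma_V^2 + \sigma_P^2} \le \min(\sigma_V^2, \sigma_P^2).

Because visual direction estimates are typically less variable than proprioceptive ones (sigma_V^2 < sigma_P^2), the visual weight w_V exceeds w_P, consistent with the visual dominance noted in the Highlights, and the combined variance is smaller than either unimodal variance, which is the precision benefit mentioned above.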