Abstract
When reaching to grasp an object, we often move our arm and orient our gaze together. How are these movements coordinated? To investigate this question, we studied neuronal activity in the dorsal premotor area (PMd) and the medial intraparietal area (area MIP) of two monkeys while systematically varying the starting position of the hand and eye during reaching. PMd neurons encoded the relative position of the target, hand, and eye. MIP neurons encoded target location with respect to the eye only. These results indicate that whereas MIP encodes target locations in an eye-centered reference frame, PMd uses a relative position code that specifies the differences in locations between all three variables. Such a relative position code may play an important role in coordinating hand and eye movements by computing their relative position.
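The distinction between the two codes described above can be illustrated with a toy sketch. This is a hypothetical illustration, not the authors' model: an eye-centered code (as reported for MIP) keeps only the target position relative to the eye, while a relative position code (as reported for PMd) keeps the pairwise differences among target, hand, and eye. All positions and function names here are invented for illustration.

```python
import numpy as np

def eye_centered_code(target, eye):
    """Target location with respect to the eye only (MIP-like)."""
    return target - eye

def relative_position_code(target, hand, eye):
    """Pairwise differences among target, hand, and eye (PMd-like)."""
    return {
        "target_minus_eye": target - eye,
        "target_minus_hand": target - hand,
        "hand_minus_eye": hand - eye,
    }

# Hypothetical 2-D positions (arbitrary units).
target = np.array([10.0, 5.0])
hand = np.array([2.0, 0.0])
eye = np.array([4.0, 3.0])

print(eye_centered_code(target, eye))            # target relative to eye
print(relative_position_code(target, hand, eye)) # all pairwise differences
```

One consequence worth noting: shifting target, hand, and eye together by the same amount leaves both codes unchanged, but only the relative position code additionally specifies where the hand sits with respect to the eye, which is the information needed to coordinate the two effectors.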