Abstract

This study shows, for the first time, hand-shape decoding from human PPC. Unlike nonhuman primate studies, in which the visual stimuli are the objects to be grasped, the visually cued hand shapes that we use are independent of the stimuli. Furthermore, we show that distinct neuronal populations are activated for the visual cue and the imagined hand shape. Additionally, we found that auditory and visual stimuli that cue the same hand shape are processed differently in PPC. Early in a trial, only the visual stimuli, and not the auditory stimuli, can be decoded. During the later stages of a trial, the motor imagery for a particular hand shape can be decoded for both modalities.
