Abstract

Movements are defining characteristics of all behaviors. Animals walk around, move their eyes to explore the world, or touch structures to learn more about them. So far, we have only a basic understanding of how the brain generates movements, especially of how different brain areas interact with each other. In this study, we investigated the influence of sensory object information on grasp planning in four brain areas involved in vision, touch, movement planning, and movement generation: the parietal, somatosensory, premotor, and motor cortex. We trained one monkey to grasp objects that he either saw or touched beforehand while continuously recording neural spiking activity with chronically implanted floating multi-electrode arrays. The animal was instructed to sit in the dark and either look at a briefly illuminated object or reach out and explore the object with his hand in the dark before lifting it up. In a first analysis, we confirmed that the animal not only memorized the object in both tasks but also applied an object-specific grip type, independent of the sensory modality. In the neuronal population, we found a significant difference in the number of units tuned to sensory modality during grasp planning that persisted into grasp execution. These differences were sufficient to enable a classifier to decode the object and the sensory modality in single trials exclusively from neural population activity. These results give valuable insights into how different brain areas contribute to the preparation of grasp movements and how different sensory streams can lead to distinct neural activity while still resulting in the same action execution.

Highlights

  • Sensory-motor transformation flexibly links sensory information from several sensory modalities to meaningful action activations

  • For example, if an object is seen, visual information is processed through visual areas like the primary visual cortex (V1) or the anterior intraparietal area (AIP), where it might serve as a basis for selecting future actions (Snowden et al., 1991; Murata et al., 2000; Lehmann and Scherberger, 2015; Schaffelhofer and Scherberger, 2016; Self et al., 2019)

  • We first evaluated whether the animal would use the sensory information he collected during the cue period

Introduction

Sensory-motor transformation flexibly links sensory information from several sensory modalities to meaningful action activations. Baumann et al. (2009) demonstrated that neurons in AIP encode object orientations as well as grip types during a delayed grasping task with a visually presented target. AIP is an important area for the processing of object interactions (Taira et al., 1990; Sakata et al., 1995; Borra et al., 2007; Lehmann and Scherberger, 2013; Schaffelhofer, 2014; Schaffelhofer and Scherberger, 2016).
