Abstract

Neural circuits generate representations of the external world from multiple information streams. The navigation system provides an exceptional lens through which we may gain insights about how such computations are implemented. Neural circuits in the medial temporal lobe construct a map-like representation of space that supports navigation. This computation integrates multiple sensory cues, and, in addition, is thought to require cues related to the individual’s movement through the environment. Here, we identify multiple self-motion signals, related to the position and velocity of the head and eyes, encoded by neurons in a key node of the navigation circuitry of mice, the medial entorhinal cortex (MEC). The representation of these signals is highly integrated with other cues in individual neurons. Such information could be used to compute the allocentric location of landmarks from visual cues and to generate internal representations of space.

Highlights

  • Neural circuits generate representations of the external world from multiple information streams

  • To assess how medial entorhinal cortex (MEC) neural activity varies with 3D head movement alongside previously recognized navigational variables, we fit a series of linear-nonlinear (LN) Poisson models to the spike train of each cell [27] (Methods, Supplementary Fig. 2); a sketch of this model class follows the list below

  • Our experiments revealed neural activity in MEC correlated with the position and movement of the head and eyes about multiple axes, in addition to the previously reported body position, body speed, and azimuthal head direction signals
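
For orientation, the sketch below illustrates the general linear-nonlinear Poisson framework referenced in the second highlight: binned navigational variables enter a one-hot design matrix, the spike count in each time bin is modeled as Poisson with rate exp(X·w), and the weights are fit by penalized maximum likelihood. This is a minimal illustration, not the authors' code; the variable names (pitch, speed), bin counts, bin width, and smoothness penalty are assumptions made for the example.

    # Minimal sketch of a linear-nonlinear (LN) Poisson model fit.
    # Assumed for illustration: two binned variables (head pitch, running speed),
    # a 20 ms time bin, and a roughness penalty on adjacent tuning-curve bins.
    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(0)

    # --- Hypothetical data: binned covariates and a spike count vector ---
    n_t = 5000                      # number of time bins
    dt = 0.02                       # bin width in seconds (assumed)
    n_pitch, n_speed = 20, 10       # number of bins per variable (assumed)

    # One-hot design matrix indicating which bin each sample falls in
    pitch_bins = rng.integers(0, n_pitch, n_t)
    speed_bins = rng.integers(0, n_speed, n_t)
    X = np.zeros((n_t, n_pitch + n_speed))
    X[np.arange(n_t), pitch_bins] = 1.0
    X[np.arange(n_t), n_pitch + speed_bins] = 1.0

    # Simulated spike counts from a made-up tuning (demonstration only)
    w_true = np.concatenate([np.sin(np.linspace(0, np.pi, n_pitch)) + 0.5,
                             np.linspace(0.0, 1.0, n_speed)])
    y = rng.poisson(np.exp(X @ w_true) * dt)

    # --- LN Poisson negative log-likelihood with a smoothness penalty ---
    def neg_log_lik(w, X, y, dt, beta=1.0):
        rate = np.exp(X @ w) * dt                 # conditional intensity per bin
        nll = np.sum(rate - y * (X @ w))          # Poisson NLL (constants dropped)
        # Penalize differences between adjacent bins of each variable
        roughness = (np.sum(np.diff(w[:n_pitch]) ** 2)
                     + np.sum(np.diff(w[n_pitch:]) ** 2))
        return nll + beta * roughness

    w0 = np.zeros(X.shape[1])
    res = minimize(neg_log_lik, w0, args=(X, y, dt), method="L-BFGS-B")

    # Recovered tuning curves (spikes/s) for each modeled variable
    pitch_tuning = np.exp(res.x[:n_pitch])
    speed_tuning = np.exp(res.x[n_pitch:])
    print("Fitted pitch tuning (Hz):", np.round(pitch_tuning, 2))

In the full approach, models containing different combinations of variables are typically compared by cross-validated log-likelihood to decide which variables a given cell encodes.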

Introduction

Neural circuits generate representations of the external world from multiple information streams. Neural circuits in the medial temporal lobe construct a map-like representation of space that supports navigation. This computation integrates multiple sensory cues, and, in addition, is thought to require cues related to the individual’s movement through the environment. We identify multiple self-motion signals, related to the position and velocity of the head and eyes, encoded by neurons in a key node of the navigation circuitry of mice, the medial entorhinal cortex (MEC). The representation of these signals is highly integrated with other cues in individual neurons. Using two experimental setups, in which we tracked either the 3D position and velocity of the head during random foraging or the position and velocity of the eyes during head-fixed navigation, we identified neural activity in the MEC associated with the pitch and roll position and angular azimuthal velocity of the head, as well as the position and velocity of the eyes.
