Abstract

The sensory signals generated by self-motion are complex and multimodal, but the ability to integrate these signals into a unified self-motion percept to guide navigation is essential for animal survival. Here, we summarize classic and recent work on self-motion coding in the visual and entorhinal cortices of the rodent brain. We compare motion processing in rodent and primate visual cortices, highlighting the strengths of classic primate work in establishing causal links between neural activity and perception, and discuss the integration of motor and visual signals in rodent visual cortex. We then turn to the medial entorhinal cortex (MEC), where calculations using self-motion to update position estimates are thought to occur. We focus on several key sources of self-motion information to MEC: the medial septum, which provides locomotor speed information; visual cortex, whose input has been increasingly recognized as essential to both position and speed-tuned MEC cells; and the head direction system, which is a major source of directional information for self-motion estimates. These inputs create a large and diverse group of self-motion codes in MEC, and great interest remains in how these self-motion codes might be integrated by MEC grid cells to estimate position. However, which signals are used in these calculations and the mechanisms by which they are integrated remain controversial. We end by proposing future experiments that could further our understanding of the interactions between MEC cells that code for self-motion and position and clarify the relationship between the activity of these cells and spatial perception.
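The position-update computation referenced above, in which a speed estimate and a head-direction estimate are combined to update an estimate of position (path integration), can be illustrated with a minimal numerical sketch. The code below is not a model from the work summarized here; the function name `path_integrate`, the sampling interval, and the noise parameters are illustrative assumptions.

```python
import numpy as np

def path_integrate(speeds, headings, dt=0.02, start=(0.0, 0.0)):
    """Toy path integration: update a 2D position estimate by combining
    a running-speed signal (cm/s) with a head-direction signal (radians).

    Illustrative sketch only; parameter names and values are assumptions,
    not a model taken from the review.
    """
    position = np.array(start, dtype=float)
    trajectory = [position.copy()]
    for v, theta in zip(speeds, headings):
        # Velocity vector from the speed estimate and the heading estimate,
        # integrated over one time step to move the position estimate.
        position += v * dt * np.array([np.cos(theta), np.sin(theta)])
        trajectory.append(position.copy())
    return np.array(trajectory)

# Example: a noisy, slowly turning run sampled at 50 Hz.
rng = np.random.default_rng(0)
t = np.arange(0.0, 10.0, 0.02)
speeds = 15.0 + rng.normal(0.0, 1.0, t.size)        # ~15 cm/s with jitter
headings = 0.5 * t + rng.normal(0.0, 0.05, t.size)  # gradually rotating heading
estimated_path = path_integrate(speeds, headings)
print(estimated_path[-1])  # final position estimate (cm)
```

In this sketch, errors in either input accumulate in the position estimate over time, which is one reason the identity of the signals feeding these calculations, and how they are combined, matters for the questions raised in the abstract.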
