Abstract

Place and grid cells in the hippocampal formation provide foundational representations of environmental location, and potentially of locations within conceptual spaces. Some accounts predict that environmental sensory information and self-motion are encoded in complementary representations, while other models suggest that these inputs combine to produce a single coherent representation. Here, we use virtual reality to dissociate visual environmental from physical motion inputs, while recording place and grid cells in mice navigating virtual open arenas. Place cell firing patterns predominantly reflect visual inputs, while grid cell activity reflects a greater influence of physical motion. Thus, even when recorded simultaneously, place and grid cell firing patterns differentially reflect environmental information (or ‘states’) and physical self-motion (or ‘transitions’), and need not be mutually coherent.

Highlights

  • Place and grid cells in the hippocampal formation provide foundational representations of environmental location, and potentially of locations within conceptual spaces

  • We examined the spatial firing patterns of place cells from hippocampal region CA1 and grid cells from medial entorhinal cortex in 2-d virtual reality (VR), focussing on probe trials in which the visual ‘gain’ (G) applied to one axis of virtual movement was either increased (G = 2) or decreased (G = 2/3) relative to the baseline condition (G = 1); the manipulation is sketched after this list

  • By changing the gain of movement of the visual projection in VR relative to the physical motion of the mouse, we have shown that the 2-d spatial firing patterns of CA1 place cells are more strongly influenced by visual inputs, whereas those of grid cells show a greater influence of physical motion
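
To make the gain manipulation concrete, the following minimal sketch (a hypothetical Python illustration, not the study’s actual VR software; the function and parameter names are our own) shows how a visual gain G applied to one axis rescales the virtual displacement produced by the mouse’s physical movement, so that G = 2 expands and G = 2/3 compresses visual movement along that axis relative to the G = 1 baseline.

    # Hypothetical sketch of a one-axis visual gain manipulation in 2-d VR:
    # physical displacement is converted to virtual displacement, with the
    # gain G applied to one axis only (G = 1 baseline, G = 2, or G = 2/3).
    def update_virtual_position(virtual_xy, physical_dxy, gain=1.0, gained_axis=0):
        scale = [1.0, 1.0]
        scale[gained_axis] = gain  # gain rescales movement along one axis
        return tuple(v + s * d for v, s, d in zip(virtual_xy, scale, physical_dxy))

    # Example: with G = 2, 10 cm of physical movement along the gained axis
    # yields 20 cm of visual movement; the other axis is unaffected.
    pos = update_virtual_position((0.0, 0.0), physical_dxy=(10.0, 10.0), gain=2.0)
    print(pos)  # (20.0, 10.0)

Under such a manipulation, visual landmarks and optic flow indicate a different distance travelled than the physical self-motion along the gained axis, which is what allows the two influences on place and grid cell firing to be dissociated.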

Introduction

Virtual reality (VR) can be used to manipulate the relationship between physical (motoric/proprioceptive) self-motion signals and environmental visual information (including both identifiable landmarks and optic flow) so that their relative influences on place and grid cell firing can be identified. This approach has been used on 1-dimensional (1-d) virtual tracks while recording from place cells[17] or grid cells[18], suggesting that both types of input can influence the pattern of firing along the track in both types of cells, in ways that vary across cells[17] and conditions[18]; see Discussion. Here, we used a VR system for mice, following a similar system for rats[19,20], which allows navigation and the expression of spatial firing patterns within 2-d open-field virtual environments[21].
