Abstract

Object-vector (OV) cells are cells in the medial entorhinal cortex (MEC) that track an animal’s distance and direction to objects in the environment. Their firing fields are defined by vectorial relationships to free-standing 3-dimensional (3D) objects of a variety of identities and shapes. However, the natural world contains a panorama of objects, ranging from discrete 3D items to flat two-dimensional (2D) surfaces, and it remains unclear which object features are the most fundamental in driving vectorial responses. Here we address this question by systematically changing features of experimental objects. Using an algorithm that robustly identifies OV firing fields, we show that the cells respond to a variety of 2D surfaces, with visual contrast being the most basic visual feature sufficient to elicit neural responses. The findings suggest that OV cells use plain visual features as vectorial anchoring points, allowing vector-guided navigation to proceed in environments with few free-standing landmarks.

Highlights

  • Object-vector (OV) cells are cells in the medial entorhinal cortex (MEC) that track an animal’s distance and direction to objects in the environment

  • OV cells respond to a wide array of objects, from small to large, narrow to wide, and short to tall, with a tendency to respond more strongly to taller objects[23]

  • OV cells were identified by an ‘OV score’, defined as the Pearson correlation between pairs of object-centred rate maps from the ‘Object’ and ‘Moved Object’ trials (a minimal sketch of this computation follows below)
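The OV score described in the last highlight can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the authors’ published analysis pipeline: the function names, bin size, map extent and the use of raw position-sample counts as an occupancy proxy are all choices made here for clarity.

```python
# Minimal sketch of an 'OV score': the Pearson correlation between
# object-centred rate maps from the 'Object' and 'Moved Object' trials.
# All parameters and helper names below are illustrative assumptions.
import numpy as np
from scipy.stats import pearsonr


def object_centred_rate_map(spike_xy, pos_xy, object_xy, bin_size=2.5, extent=75.0):
    """Bin firing in coordinates relative to the object position.

    spike_xy : (n_spikes, 2) animal position at each spike (cm)
    pos_xy   : (n_samples, 2) tracked positions (cm), assumed uniform sampling
    object_xy: (2,) object location (cm)
    """
    edges = np.arange(-extent, extent + bin_size, bin_size)
    # Shift both occupancy and spike positions into an object-centred frame.
    occ, _, _ = np.histogram2d(*(pos_xy - object_xy).T, bins=[edges, edges])
    spk, _, _ = np.histogram2d(*(spike_xy - object_xy).T, bins=[edges, edges])
    with np.errstate(invalid="ignore", divide="ignore"):
        # Spikes per position sample; divide by the sample interval to get Hz.
        rate = spk / occ  # unvisited bins become NaN
    return rate


def ov_score(map_object, map_moved):
    """Pearson correlation over bins visited in both object-centred maps."""
    valid = np.isfinite(map_object) & np.isfinite(map_moved)
    r, _ = pearsonr(map_object[valid], map_moved[valid])
    return r
```

Because the two maps come from trials with the object at different locations, a high correlation in object-centred coordinates indicates that the firing field moves with the object, which is the defining property of an OV cell.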

Introduction

Medial entorhinal cortex (MEC) and the adjacent pre- and parasubiculum are critical components of the neural representation of space[1,2], hosting cell types that dynamically signal the animal’s position[3,4,5], head direction[5,6,7], speed[8] and proximity to borders[5,9,10]. These cells coexist and interact with a wide range of allocentrically tuned cell types in neighbouring hippocampal–parahippocampal regions, such as place cells[11,12], boundary-vector (BV) cells[13,14,15] and landmark-controlled cells[16,17,18], as well as various cells that encode position in egocentric coordinates relative to the animal’s body axis[19,20,21,22].

After finding that 2D surfaces on the wall of the recording environment are sufficient to elicit OV firing, we show that a simple visual contrast on the wall is enough to elevate activity.
