Abstract

Animals actively interact with their environment to gather sensory information. There is conflicting evidence about how mice use vision to sample their environment. During head restraint, mice make rapid eye movements coupled between the eyes, similar to conjugate saccadic eye movements in humans. However, when mice are free to move their heads, eye movements are more complex and often non-conjugate, with the eyes moving in opposite directions. We combined head and eye tracking in freely moving mice and found both observations are explained by two eye-head coupling types, associated with vestibular mechanisms. The first type comprised non-conjugate eye movements, which compensate for head tilt changes to maintain a similar visual field relative to the horizontal ground plane. The second type of eye movements was conjugate and coupled to head yaw rotation to produce a “saccade and fixate” gaze pattern. During head-initiated saccades, the eyes moved together in the head direction but during subsequent fixation moved in the opposite direction to the head to compensate for head rotation. This saccade and fixate pattern is similar to humans who use eye movements (with or without head movement) to rapidly shift gaze but in mice relies on combined head and eye movements. Both couplings were maintained during social interactions and visually guided object tracking. Even in head-restrained mice, eye movements were invariably associated with attempted head motion. Our results reveal that mice combine head and eye movements to sample their environment and highlight similarities and differences between eye movements in mice and humans.

Highlights

  • During natural behaviors, animals actively sample their sensory environment [1, 2]

  • To investigate eye/head movement relations, we used a system that we recently developed for tracking eye positions together with head tilt and head rotations in freely moving mice [23]

  • The system includes two head-mounted cameras combined with an inertial measurement unit (IMU) sensor (Figure 1A)


Introduction

Animals actively sample their sensory environment [1, 2]. For example, humans use a limited and highly structured set of head and eye movements (see [3] and references therein) to shift their gaze (eye in head + head in space) to selectively extract relevant information during visually guided behaviors, like making a cup of tea [4] or a peanut butter sandwich [5]. The mouse has emerged as a major model organism in vision research, due to the availability of genetic tools to dissect neural circuits and model human disease. This has yielded detailed insights into the circuitry and response properties of early visual pathways in mice (see [8] for a recent review). Mice use vision during natural behaviors, such as threat detection [9] and prey capture [10]. They can be trained on standard visual paradigms similar to those used in humans and non-human primates, including visual detection and discrimination tasks, with or without head restraint [11, 12, 13]. The aim of our study was to determine how head and eye movements contribute to visually guided behaviors.
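The gaze decomposition above (gaze = eye in head + head in space) and the "saccade and fixate" pattern from the Summary can be sketched numerically. This is a hypothetical illustration, not the authors' analysis code: the function name, angle convention (horizontal yaw in degrees), and the sample values are invented for this sketch.

```python
import numpy as np

def gaze_angle(head_yaw_deg, eye_in_head_deg):
    """Horizontal gaze direction in world coordinates:
    gaze = head-in-space + eye-in-head (both in degrees)."""
    return head_yaw_deg + eye_in_head_deg

# During fixation in a "saccade and fixate" pattern, the head keeps
# rotating while the eyes counter-rotate, so gaze stays roughly stable.
# Values below are invented for illustration.
head_yaw = np.array([0.0, 10.0, 20.0, 30.0])        # head keeps turning
eye_in_head = np.array([0.0, -10.0, -20.0, -30.0])  # eyes compensate
gaze = gaze_angle(head_yaw, eye_in_head)
print(gaze)  # stable gaze during fixation: [0. 0. 0. 0.]
```

In this toy trace the compensatory eye movement exactly cancels the head rotation, so the computed gaze direction does not change between samples, mirroring the fixation phase described above.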

