Abstract
Self-motion perception is a vital skill for all species. It is an inherently multisensory process that combines inertial (body-based) and relative (with respect to the environment) motion cues. Although extensively studied in human and non-human primates, there is currently no paradigm to test self-motion perception in rodents using both inertial and relative self-motion cues. We developed a novel rodent motion simulator using two synchronized robotic arms to generate inertial, relative, or combined (inertial and relative) cues of self-motion. Eight rats were trained to perform a heading discrimination task, similar to the popular primate paradigm. Strikingly, the rats relied heavily on airflow for relative self-motion perception, with little contribution from the (limited) optic flow cues provided; performance in the dark was almost as good. Relative self-motion (airflow) was perceived with greater reliability than inertial self-motion. Disrupting airflow, using a fan or windshield, impaired relative, but not inertial, self-motion perception. The whiskers, however, were not needed for this function. Lastly, the rats integrated relative and inertial self-motion cues in a reliability-based (Bayesian-like) manner. These results implicate airflow as an important cue for self-motion perception in rats and provide a new domain to investigate the neural bases of self-motion perception and multisensory processing in awake behaving rodents.