Abstract

Animals benefit from knowing if and how they are moving. Across the animal kingdom, sensory information in the form of optic flow over the visual field is used to estimate self-motion. However, different species exhibit strong spatial biases in how they use optic flow. Here, we show computationally that noisy natural environments favor visual systems that extract spatially biased samples of optic flow when estimating self-motion. The performance associated with these biases, however, depends on interactions between the environment and the animal's brain and behavior. Using the larval zebrafish as a model, we recorded natural optic flow associated with swimming trajectories in the animal's habitat, using an omnidirectional camera mounted on a mechanical arm. An analysis of these flow fields suggests that lateral regions of the lower visual field are most informative about swimming speed. This pattern is consistent with recent findings that zebrafish optomotor responses are preferentially driven by optic flow in the lateral lower visual field, which we extend with behavioral results from a high-resolution spherical arena. Spatial biases in optic-flow sampling are likely pervasive because they are an effective strategy for determining self-motion in noisy natural environments.
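The central claim, that spatially biased sampling of optic flow improves self-motion estimates in noisy environments, can be illustrated with a toy simulation. The sketch below is not the authors' analysis pipeline: the two visual-field regions, their noise levels, and the linear flow model are illustrative assumptions chosen only to show that averaging flow samples from a low-noise region yields lower speed-estimation error than sampling from a noisier one.

```python
# Minimal sketch (illustrative assumptions, not the paper's method):
# estimate swimming speed from noisy optic-flow samples drawn from two
# hypothetical visual-field regions with different noise levels.
import numpy as np

rng = np.random.default_rng(0)

# Assumed region-dependent noise: the lateral lower visual field is modeled
# as less noisy than the upper visual field (an assumption for illustration).
REGION_NOISE = {"upper": 1.0, "lateral_lower": 0.2}


def simulate_flow(speed, region, n_samples=50):
    """Noisy optic-flow magnitude samples from one visual-field region,
    assuming flow magnitude is proportional to swimming speed."""
    sigma = REGION_NOISE[region]
    return speed + sigma * rng.standard_normal(n_samples)


def estimate_speed(samples):
    """Simple estimator: average the flow samples."""
    return samples.mean()


true_speed = 1.0
trials = 2000
sq_errors = {region: [] for region in REGION_NOISE}

for _ in range(trials):
    for region in REGION_NOISE:
        est = estimate_speed(simulate_flow(true_speed, region))
        sq_errors[region].append((est - true_speed) ** 2)

for region, errs in sq_errors.items():
    print(f"{region:14s} RMSE = {np.sqrt(np.mean(errs)):.3f}")
```

Under these assumptions, the region with lower flow noise gives a markedly smaller root-mean-square error, which is the intuition behind favoring spatially biased optic-flow sampling.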
