Abstract

Insects exhibit remarkably robust closed-loop flight dynamics in the face of uncertainties. A fundamental principle contributing to this behavior is the rapid processing and convergence of visual sensory information into flight motor commands via spatial wide-field integration, accomplished by motion-pattern-sensitive interneurons (LPTCs) in the lobula plate of the visual ganglia. Models for spatially continuous retinal image flow and wide-field integration processing are presented, and within a control-theoretic framework the hover response of a planar rotorcraft is demonstrated using static output feedback of wide-field integration signals. Hence, extraction of global retinal motion cues through computationally efficient wide-field integration provides a novel and promising methodology for exploiting visual sensory information in autonomous robotic navigation and flight control applications.
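To make the wide-field integration idea concrete, the sketch below projects a planar optic flow pattern onto a few low-order azimuthal weighting functions and feeds the resulting scalar outputs through a static gain matrix, in the spirit of the static output feedback described above. The sensor count, weighting functions, nearness profile, perturbation values, and gain matrix are illustrative assumptions for this sketch, not parameters taken from the paper.

```python
# Minimal sketch of wide-field integration (WFI) of optic flow followed by
# static output feedback, assuming a planar ring of viewing directions.
# All weighting functions, gains, and state values are illustrative assumptions.

import numpy as np

N = 60                                                      # number of viewing directions (assumed)
gamma = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)    # azimuthal angles around the body

def optic_flow(u, v, omega, nearness):
    """Planar optic flow along the ring for body translation (u, v),
    rotation rate omega, and a nearness (inverse-distance) profile."""
    return (u * np.sin(gamma) - v * np.cos(gamma)) * nearness - omega

def wfi_outputs(flow):
    """Project the flow pattern onto low-order Fourier weighting functions,
    mimicking the spatial integration performed by LPTC-like interneurons."""
    weights = np.stack([np.ones_like(gamma),   # rotation-like component
                        np.cos(gamma),         # fore-aft asymmetry
                        np.sin(gamma)])        # lateral asymmetry
    return weights @ flow * (2.0 * np.pi / N)  # discrete inner products over azimuth

# Example: hover regulation with a hypothetical static output-feedback gain matrix K.
mu = np.full(N, 1.0)                           # uniform nearness (assumed environment)
u0, v0, omega0 = 0.2, -0.1, 0.05               # small assumed perturbation from hover
y = wfi_outputs(optic_flow(u0, v0, omega0, mu))
K = np.array([[ 0.0, -1.5,  0.0],              # longitudinal channel (assumed gains)
              [ 0.0,  0.0, -1.5],              # lateral channel (assumed gains)
              [-0.8,  0.0,  0.0]])             # rotational channel (assumed gains)
u_cmd = K @ y                                  # motor commands from WFI signals
print("WFI outputs:", y, "commands:", u_cmd)
```

Running the script maps the three wide-field outputs directly to corrective commands with no state estimator in the loop, which is the appeal of static output feedback of these signals: the computation reduces to a handful of inner products and a matrix-vector multiply.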
