Abstract
Insects exhibit incredibly robust closed-loop flight dynamics in the face of uncertainties. A fundamental principle contributing to this unparalleled behavior is the rapid processing and convergence of visual sensory information into flight motor commands via spatial wide-field integration, accomplished by retinal motion pattern sensitive interneurons (LPTCs) in the lobula plate portion of the visual ganglia. Within a control-theoretic framework, models for spatially continuous retinal image flow and wide-field integration processing are developed, establishing the connection between image flow kernels (retinal motion pattern sensitivities) and the feedback terms they represent. It is shown that these wide-field integration outputs are sufficient to stabilize speed regulation and terrain-following tasks. Hence, extraction of global retinal motion cues through computationally efficient wide-field integration processing provides a novel and promising methodology for utilizing visual sensory information in autonomous robotic navigation and flight control applications.
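To make the wide-field integration idea concrete, the sketch below illustrates the general scheme described in the abstract: an optic flow pattern sampled around a ring of viewing directions is projected (via an inner product) onto a small set of spatial kernels, each projection yielding one scalar output usable as a feedback term. The planar flow model, the centered-tunnel nearness function, and the low-order Fourier kernels are illustrative assumptions, not the specific kernels derived in the paper.

```python
import numpy as np

def optic_flow_pattern(gammas, u, v, w, nearness):
    """Assumed planar flow model: Qdot(gamma) = -w + mu(gamma)*(u*sin(gamma) - v*cos(gamma)),
    for forward speed u, lateral speed v, yaw rate w, and nearness mu(gamma)."""
    return -w + nearness * (u * np.sin(gammas) - v * np.cos(gammas))

def wide_field_integration(flow, gammas, kernels):
    """Project the flow field onto each spatial kernel (an LPTC-like motion
    pattern sensitivity): y_i = (1/(2*pi)) * integral F_i(gamma) Qdot(gamma) dgamma."""
    dgamma = gammas[1] - gammas[0]
    return np.array([np.sum(k * flow) * dgamma / (2.0 * np.pi) for k in kernels])

# Illustrative setup: 1-degree sampling of the ring, a centered tunnel of
# assumed half-width a (nearness |sin(gamma)|/a), and constant/cos/sin kernels.
N = 360
gammas = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)
a = 1.0
nearness = np.abs(np.sin(gammas)) / a
flow = optic_flow_pattern(gammas, u=1.0, v=0.1, w=0.05, nearness=nearness)

kernels = [np.ones(N), np.cos(gammas), np.sin(gammas)]
y = wide_field_integration(flow, gammas, kernels)
print(y)  # three scalar outputs, candidates for speed/position/orientation feedback
```

In this toy setting the kernel outputs encode combinations of speed, lateral offset, and rotation, which is the sense in which a handful of wide-field integrals can serve as the feedback signals for speed regulation and terrain following.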