Abstract

Insects can navigate freely through ever-changing environments using predominantly visual inputs, despite possessing minimal processing power compared to humans. Not only do they move at high velocities and accelerations, they also achieve extraordinary levels of obstacle avoidance. We begin to emulate this biological behaviour in a robotic application by first modelling how these visual pathways respond to the separable degrees of freedom within the motion field, specifically rotation. We build upon an existing biologically inspired algorithm modelled on the visual pathway of the hoverfly and statistically compare our results against current state-of-the-art algorithms, all while running on computationally constrained embedded hardware. We show that, using a complex, highly elaborated representation of the hoverfly visual pathway, rotational optical-flow estimates can be produced with high accuracy and with a consistency previously unseen in dense-flow algorithms, at 100 frames per second on an embedded system. This work forms a basis for understanding one of the two separable components of insect egomotion (rotation and translation): it delivers consistently accurate rotational-velocity estimation and provides a building block towards understanding the translational component of insect vision and towards applying biologically inspired egomotion estimation in autonomous vehicles.
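As background (this decomposition is standard computer-vision material and is not stated in the abstract itself): under a pinhole camera model (Longuet-Higgins and Prazdny, 1980), the motion field $(u, v)$ at image point $(x, y)$, for focal length $f$, scene depth $Z$, camera translation $\mathbf{T} = (T_x, T_y, T_z)$ and angular velocity $\boldsymbol{\omega} = (\omega_x, \omega_y, \omega_z)$, separates into a depth-dependent translational term and a depth-independent rotational term. A sketch of that standard decomposition, using conventional symbols that do not appear in the abstract:

\[
\begin{aligned}
u &= \frac{x T_z - f T_x}{Z} \;+\; \frac{xy}{f}\,\omega_x - \Bigl(f + \frac{x^2}{f}\Bigr)\omega_y + y\,\omega_z,\\[4pt]
v &= \frac{y T_z - f T_y}{Z} \;+\; \Bigl(f + \frac{y^2}{f}\Bigr)\omega_x - \frac{xy}{f}\,\omega_y - x\,\omega_z.
\end{aligned}
\]

Because the rotational term carries no dependence on $Z$, rotational velocity can in principle be estimated without any knowledge of scene structure, which is why rotation is the natural first component of egomotion to isolate and model.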
