Abstract

Robust and efficient target-tracking algorithms embedded on moving platforms are a requirement for many computer vision and robotic applications. However, deploying a real-time system is challenging, even with the computational power of modern hardware. For inspiration, we look to biological solutions: lightweight, low-powered flying insects. For example, dragonflies pursue prey and mates within cluttered, natural environments, deftly selecting their target amidst swarms. In our laboratory, we study the physiology and morphology of dragonfly ‘small target motion detector’ neurons likely to underlie this pursuit behaviour. Here we describe our insect-inspired tracking model derived from these data and compare its efficacy and efficiency with state-of-the-art engineering models. For model inputs, we use both publicly available video sequences and our own task-specific dataset (small targets embedded within natural scenes). In the context of the tracking problem, we describe differences in object statistics between the video sequences. For the general dataset, our model often locks on to small components of larger objects, tracking these moving features. When the input imagery includes small moving targets, to which our highly nonlinear filtering is matched, our model's robustness outperforms that of state-of-the-art trackers. In all scenarios, our insect-inspired tracker runs at least twice as fast as the comparison algorithms.
