Abstract

In this paper we propose a model of visually guided route navigation in ants that captures the known properties of real behaviour whilst retaining mechanistic simplicity and thus biological plausibility. For an ant, the coupling of movement and viewing direction means that a familiar view specifies a familiar direction of movement. Since the views experienced along a habitual route will be more familiar, route navigation can be re-cast as a search for familiar views. This search can be performed with a simple scanning routine, a behaviour that ants have been observed to perform. We test this proposed route navigation strategy in simulation, by learning a series of routes through visually cluttered environments consisting of objects that are only distinguishable as silhouettes against the sky. In the first instance we determine view familiarity by exhaustive comparison with the set of views experienced during training. In further experiments we train an artificial neural network to perform familiarity discrimination using the training views. Our results indicate not only that the approach is successful, but also that the learnt routes show many of the characteristics of the routes of desert ants. As such, we believe the model represents the only detailed and complete model of insect route guidance to date. What is more, the model provides a general demonstration that visually guided routes can be produced with parsimonious mechanisms that do not specify when or what to learn, nor separate routes into sequences of waypoints.
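
To make the familiarity-discrimination step concrete, the sketch below shows one simple way a neural network could be trained on the training views and then used to score how familiar a new view is, assuming views are flattened grey-scale pixel arrays. It is an illustration only: the architecture (a small linear autoencoder with tied weights, scored by reconstruction error) and all names and parameters are our assumptions, not the specific network used in the model.

    import numpy as np

    class FamiliarityNet:
        # A small linear autoencoder with tied weights: views resembling the
        # training views are reconstructed well, so low reconstruction error
        # signals familiarity and high error signals novelty.
        def __init__(self, n_pixels, n_hidden=50, lr=1e-3, seed=0):
            rng = np.random.default_rng(seed)
            self.W = rng.normal(0.0, 0.01, (n_hidden, n_pixels))
            self.lr = lr

        def train(self, views, epochs=20):
            # views: array of shape (n_views, n_pixels), pixel values in [0, 1].
            for _ in range(epochs):
                for x in views:
                    h = self.W @ x            # encode
                    err = self.W.T @ h - x    # decode and take the error
                    # Gradient step on 0.5 * ||W^T W x - x||^2.
                    grad = np.outer(h, err) + np.outer(self.W @ err, x)
                    self.W -= self.lr * grad

        def unfamiliarity(self, x):
            # Lower values mean x resembles the training views.
            h = self.W @ x
            return np.sum((self.W.T @ h - x) ** 2)

During navigation, such a network's output could stand in for exhaustive comparison with the stored views: the agent would step in the scanning direction that minimises unfamiliarity.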

Highlights

  • The impressive ability of social insects to learn long foraging routes guided by visual information provides proof that robust spatial behaviour can be produced with limited neural resources

  • We have developed a parsimonious model of route navigation that captures many of the known properties of ants' routes

  • During a 360° scan of the world from the current location, the viewing direction whose view gives the minimum image difference across all stored images is deemed the most familiar, and a 10 cm step is taken in that direction (see the sketch after this list)
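
The sketch below illustrates this scan-and-step routine. The helper get_view(position, heading), the list stored_views and the parameter values are hypothetical; view comparison is assumed to be a sum of squared pixel differences over grey-scale panoramic views.

    import numpy as np

    STEP_CM = 10.0  # step length along the most familiar viewing direction

    def image_difference(view_a, view_b):
        # Sum of squared pixel differences between two equally sized views.
        return np.sum((view_a.astype(float) - view_b.astype(float)) ** 2)

    def most_familiar_heading(position, stored_views, get_view, n_dirs=90):
        # Scan through 360 degrees; the heading whose view gives the minimum
        # difference to any stored training view is deemed the most familiar.
        best_heading, best_diff = 0.0, np.inf
        for heading in np.linspace(0.0, 360.0, n_dirs, endpoint=False):
            view = get_view(position, heading)
            diff = min(image_difference(view, s) for s in stored_views)
            if diff < best_diff:
                best_heading, best_diff = heading, diff
        return best_heading

    def take_step(position, heading, step_cm=STEP_CM):
        # Move step_cm along the chosen heading (degrees from the x-axis).
        rad = np.deg2rad(heading)
        return position + step_cm * np.array([np.cos(rad), np.sin(rad)])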



Introduction

The impressive ability of social insects to learn long foraging routes guided by visual information [1,2,3,4,5,6,7,8] provides proof that robust spatial behaviour can be produced with limited neural resources [9,10,11]. Understanding how this is achieved is a goal shared by biomimetic engineers and those studying animal cognition using a bottom-up approach to the understanding of natural intelligence [13]. In this field, computational models have proved useful as proof of concept [14,15] that a particular sensori-motor strategy [16] or memory organisation [17] can account for observed behaviour. Models of visual navigation that have successfully replicated place homing are dominated by snapshot-type models, in which a single view of the world memorized at the goal location is compared with the current view in order to drive a search for the goal [16,18,19,20,21,22,23,24,25,26]. In contrast, the algorithm we propose exhibits both place search and route navigation with the same mechanism.
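
For contrast with the familiarity-based approach, a snapshot-type strategy can be caricatured as follows: a single view stored at the goal is repeatedly compared with the views available at the current position, and the agent moves along the best-matching direction. The helper get_view, the sampling scheme and all parameters are illustrative assumptions, not the method of any one cited model.

    import numpy as np

    def home_by_snapshot(position, goal_snapshot, get_view,
                         step_cm=10.0, n_probes=8, n_steps=500):
        # At each step, probe a ring of headings and move along the one whose
        # view is most similar to the single snapshot stored at the goal.
        for _ in range(n_steps):
            headings = np.linspace(0.0, 360.0, n_probes, endpoint=False)
            diffs = [np.sum((get_view(position, h).astype(float)
                             - goal_snapshot.astype(float)) ** 2)
                     for h in headings]
            rad = np.deg2rad(headings[int(np.argmin(diffs))])
            position = position + step_cm * np.array([np.cos(rad), np.sin(rad)])
        return position

Note the contrast: a snapshot model stores one view for one place, whereas the familiarity model sketched earlier reuses the same scan over the whole set of views experienced along a route.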


