Abstract

As neuroscience gradually uncovers how the brain represents and computes with high-level spatial information, constructing biologically inspired robot controllers from these spatial representations has become viable. Grid cells are particularly interesting in this regard, as they are thought to provide a general coordinate system for space. Artificial neural network models of grid cells can perform path integration, but a robot also needs the ability to compute the direction from its current location, as indicated by the path integrator, to a remembered goal. This paper presents a neural system that integrates networks of path-integrating grid cells with a grid cell decoding mechanism. The decoding mechanism detects differences between multi-scale grid cell representations of the present location and the goal in order to calculate a goal-direction signal for the robot. The model successfully guides a simulated agent to its goal, showing promise for implementing the system on a real robot in the future.
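The core idea of comparing multi-scale grid codes can be illustrated outside the neural model. The sketch below is a simplified, non-neural analogue and not the paper's decoding mechanism: it assumes each grid module encodes a 2D position as a phase modulo its spatial scale, and it recovers the goal direction by coarse-to-fine unwrapping of the phase differences between the goal code and the current-location code. The module scales, function names, and refinement scheme are all illustrative assumptions.

```python
# Hypothetical sketch (not the paper's neural implementation): encode 2D positions
# as phases in several grid "modules" with different spatial scales, then compare
# the phase codes of the current location and the goal to recover a goal direction.
import numpy as np

SCALES = [8.0, 4.0, 2.0, 1.0]  # assumed module spacings, coarse to fine (arbitrary units)

def encode(pos, scales=SCALES):
    """Multi-scale phase code of a 2D position: one 2D phase per module."""
    pos = np.asarray(pos, dtype=float)
    return [np.mod(pos, s) for s in scales]

def wrapped_diff(a, b, s):
    """Smallest signed difference a - b on a torus of period s (per component)."""
    return np.mod(a - b + s / 2.0, s) - s / 2.0

def goal_direction(code_goal, code_here, scales=SCALES):
    """Estimate the displacement to the goal by refining a coarse estimate with
    progressively finer modules, then return the unit direction vector."""
    estimate = np.zeros(2)
    for pg, ph, s in zip(code_goal, code_here, scales):
        d = wrapped_diff(pg, ph, s)           # this module's wrapped phase difference
        k = np.round((estimate - d) / s)      # whole periods needed to match the estimate
        estimate = d + k * s                  # refined displacement estimate
    norm = np.linalg.norm(estimate)
    return estimate / norm if norm > 0 else estimate

# Example: agent at (1.3, 2.0), goal at (4.0, 4.5); true displacement is (2.7, 2.5).
here, goal = np.array([1.3, 2.0]), np.array([4.0, 4.5])
print(goal_direction(encode(goal), encode(here)))  # ~ [0.73, 0.68]
```

In this simplified analytic scheme the unambiguous range is limited to half the coarsest module's period; the combinatorial capacity of a population-level grid code, as exploited by the paper's network model, is much larger.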
