Abstract

To derive meaningful navigation strategies, animals must estimate their directional heading in the environment. In mammalian brains, this function is supported by head direction cells, whose neural activity encodes the animal's heading direction. Such head direction information is believed to be generated by integrating self-motion cues, a process that accumulates errors over time. To eliminate these errors, this paper presents an efficient calibration model that mimics animal behavior by exploiting visual cues in a biologically plausible way, and implements it in robotic navigation tasks. The proposed calibration model allows the agent to associate its head direction and the perceived egocentric direction of a visual cue with its position and orientation, and thereby to calibrate its head direction when the same cue is viewed again. We examine the proposed head direction calibration model in extensive simulations and real-world experiments, demonstrating its strong performance in both the quick association of information to proximal and distal cues and the accuracy with which it calibrates the integration errors of the head direction. Videos can be viewed at https://videoviewsite.wixsite.com/hdc-calibration.
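The association-and-calibration idea described above can be illustrated with a minimal sketch. This is not the paper's model, only a hypothetical geometric reading of it for a distal cue: the cue's allocentric bearing (head direction plus egocentric cue bearing) is roughly position-independent, so it can be stored on first sight and later used to correct a drifted head-direction estimate. All names (`CueCalibrator`, `associate`, `calibrate`) are illustrative assumptions, not the authors' API.

```python
import math

def wrap(angle):
    """Wrap an angle in radians to the interval (-pi, pi]."""
    return math.atan2(math.sin(angle), math.cos(angle))

class CueCalibrator:
    """Hypothetical sketch: store the allocentric bearing of a visual cue on
    first sight, then recover the head direction when the cue is seen again."""

    def __init__(self):
        # Remembered allocentric bearing per cue identity.
        self.stored = {}

    def associate(self, cue_id, head_direction, egocentric_bearing):
        """Bind the cue to its allocentric bearing:
        allocentric = head direction + egocentric bearing."""
        self.stored[cue_id] = wrap(head_direction + egocentric_bearing)

    def calibrate(self, cue_id, egocentric_bearing):
        """On re-viewing the cue, invert the association to get a
        drift-free head-direction estimate."""
        return wrap(self.stored[cue_id] - egocentric_bearing)
```

For example, a cue first seen at egocentric bearing 0.2 rad while heading 0.5 rad is stored at allocentric bearing 0.7 rad; if the same cue later appears at egocentric bearing 0.3 rad, the calibrated heading is 0.4 rad, regardless of how far the integrated estimate has drifted.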
