Abstract

Before a cognitive map is generated through the firing of rodent hippocampal spatial cells, mammals acquire knowledge of the outside world through visual information that travels from the eye to the brain. Biophysiological research shows that this information is encoded and routed to two regions of the brain, known as the "what" loop and the "where" loop. In this article, we simulate an episodic memory recognition unit that integrates the information of the two loops and apply it to building an accurate bioinspired spatial cognitive map of real environments. We employ a visual bag-of-words algorithm based on oriented Features from Accelerated Segment Test (FAST) and rotated Binary Robust Independent Elementary Features (BRIEF), i.e., ORB features, to build the "what" loop, and a hippocampal spatial cell cognitive model driven by the front-end visual information input system to build the "where" loop. The resulting environmental cognitive map is a topological map whose nodes store the place cell competitive firing rate, the ORB feature descriptor, the image retrieval similarity, and the relative location of cognitive map nodes. Simulation experiments and physical experiments on a mobile robot platform were carried out to verify the environmental adaptability and robustness of the algorithm. The proposed algorithm provides a foundation for further research on bioinspired robot navigation.
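To make the structure of the cognitive map concrete, the sketch below shows one possible representation of a topological map node carrying the quantities listed above (place cell competitive firing rate, ORB descriptor, retrieval similarity, and relative location). The class and field names are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass, field
from typing import List
import numpy as np

@dataclass
class CognitiveMapNode:
    """Hypothetical topological map node; field names are assumptions."""
    node_id: int
    firing_rate: np.ndarray        # place cell competitive firing rates at this node
    orb_descriptors: np.ndarray    # ORB descriptors of the node's key frame ("what" loop)
    retrieval_similarity: float    # bag-of-words image retrieval similarity score
    relative_position: np.ndarray  # (dx, dy, dtheta) relative to the previous node ("where" loop)
    neighbors: List[int] = field(default_factory=list)  # edges of the topological map
```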

Highlights

  • At present, mobile robot systems have achieved much in environmental simultaneous localization, mapping, and navigation based on Bayesian probability algorithms such as the Kalman filter, particle filter, bundle adjustment, and graph optimization,[1,2,3] yet current technology is still far from producing a robot that can perform daily tasks in a complex environment

  • We present a bioinspired cognitive map-building system based on episodic memory recognition

  • Inspired by the achievements of biophysiological research, we apply the motion information obtained from the front-end visual input system to drive the firing activities of neuron cells, as sketched below
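As a rough illustration of how front-end motion and position estimates can drive neuron firing, the sketch below models place cells with Gaussian tuning curves and a simple winner-take-all competition. The cell centers, tuning width, and competition rule are assumptions made for illustration, not the paper's cell model.

```python
import numpy as np

def place_cell_firing(position, centers, sigma=0.5):
    """Gaussian place-cell tuning: firing falls off with distance from each cell's preferred location.

    position: (2,) current position estimate from the front-end visual input system
    centers:  (N, 2) preferred firing locations of N place cells (illustrative assumption)
    """
    d2 = np.sum((centers - position) ** 2, axis=1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def competitive_firing(rates):
    """Simple winner-take-all competition: only the most active cell keeps a nonzero rate."""
    winner = np.argmax(rates)
    out = np.zeros_like(rates)
    out[winner] = rates[winner]
    return out

# Example: a 5 x 5 grid of place cells covering a small arena
grid = np.stack(np.meshgrid(np.linspace(0, 4, 5), np.linspace(0, 4, 5)), axis=-1).reshape(-1, 2)
rates = place_cell_firing(np.array([1.2, 3.1]), grid)
print(competitive_firing(rates))
```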


Introduction

Mobile robot systems have achieved much in environmental simultaneous localization, mapping, and navigation based on Bayesian probability algorithms such as the Kalman filter, particle filter, bundle adjustment, and graph optimization.[1,2,3] However, current technology is still far from producing a robot that can perform daily tasks in a complex environment. Such daily tasks require a cognitive map and episodic memory. The improved FAST key point detection algorithm,[23] named oriented FAST and rotated Binary Robust Independent Elementary Features (BRIEF), or ORB, provides rotation invariance and can detect key points at different scales. It performs well in terms of invariance to rotation, scale, and illumination at low computational cost.
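As a concrete illustration of the ORB features referred to above, the sketch below uses OpenCV's ORB implementation to detect key points at multiple scales and to score the similarity of two images by Hamming-distance matching of their binary descriptors. The parameter values, file names, and ratio-test similarity score are illustrative assumptions, not the settings used in the paper.

```python
import cv2

def orb_similarity(img_path_a, img_path_b, n_features=500):
    """Detect ORB key points/descriptors and return a crude similarity score in [0, 1]."""
    orb = cv2.ORB_create(nfeatures=n_features)        # oriented FAST + rotated BRIEF
    img_a = cv2.imread(img_path_a, cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread(img_path_b, cv2.IMREAD_GRAYSCALE)

    kp_a, des_a = orb.detectAndCompute(img_a, None)    # multi-scale, rotation-aware key points
    kp_b, des_b = orb.detectAndCompute(img_b, None)
    if des_a is None or des_b is None:
        return 0.0

    # Binary descriptors are compared with the Hamming distance (cheap to compute)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    matches = matcher.knnMatch(des_a, des_b, k=2)

    # Lowe's ratio test keeps only distinctive matches
    good = [p[0] for p in matches if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
    return len(good) / max(len(kp_a), len(kp_b))

# Example usage (hypothetical file names):
# print(orb_similarity("frame_001.png", "frame_002.png"))
```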
