Abstract

This paper proposes a framework of episodic memory-driven Markov decision processes (EM-MDPs) for incremental self-learning of robotic experience and cognitive behavior control under uncertainty. The framework simulates the organization of episodic memory by introducing a neuron stimulation mechanism. First, an episode model is built, and an activation and stimulation mechanism for state neurons is proposed based on cognitive neuroscience. Second, an episodic self-learning method is proposed that uses sparse distributed memory (SDM) with Hebbian rules to realize real-time memory storage, incremental accumulation, and integration. Finally, a robotic cognitive behavior control approach is established in which neuron synaptic potential is introduced for event localization: the robot can evaluate past event sequences, predict the current state, and plan the desired behavior. The paper addresses two main challenges in robot behavior control under uncertainty: high computational complexity and perceptual aliasing. The proposed system is evaluated in several real-life environments on a mobile robot, and the results validate the applicability and usefulness of the developed method.
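To make the storage step concrete, the sketch below shows a sparse distributed memory whose content matrix is updated with a Hebbian-style rule, the kind of incremental episodic storage the abstract refers to. This is not the authors' implementation; the class name and all parameters (address_count, radius, learning_rate) are illustrative assumptions.

```python
# Minimal sketch (assumed, not from the paper) of Hebbian writes into a
# sparse distributed memory (SDM): repeated co-activation strengthens the
# stored trace, and recall works from a noisy cue.
import numpy as np

class HebbianSDM:
    def __init__(self, address_count=256, dim=64, radius=26, learning_rate=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # Fixed random hard addresses; an input activates all addresses
        # within Hamming distance `radius`.
        self.addresses = rng.integers(0, 2, size=(address_count, dim))
        self.contents = np.zeros((address_count, dim))
        self.radius = radius
        self.lr = learning_rate

    def _active(self, x):
        dist = np.sum(self.addresses != x, axis=1)   # Hamming distances
        return dist <= self.radius                   # mask of activated locations

    def write(self, x):
        """Hebbian-style incremental storage: strengthen active locations toward x."""
        x = np.asarray(x)
        active = self._active(x)
        bipolar = 2 * x - 1                          # map {0,1} -> {-1,+1}
        self.contents[active] += self.lr * bipolar   # co-activation reinforces the trace

    def read(self, x):
        """Recall: sum contents of active locations and threshold."""
        active = self._active(np.asarray(x))
        summed = self.contents[active].sum(axis=0)
        return (summed > 0).astype(int)

if __name__ == "__main__":
    mem = HebbianSDM()
    rng = np.random.default_rng(1)
    pattern = rng.integers(0, 2, size=64)
    for _ in range(5):
        mem.write(pattern)                           # incremental accumulation
    noisy = pattern.copy()
    noisy[:5] ^= 1                                   # corrupt a few bits of the cue
    recalled = mem.read(noisy)
    print("bits recovered:", int(np.sum(recalled == pattern)), "/ 64")
```

In an EM-MDP-style setting, each stored vector would encode a perceived state/event, so repeated experience accumulates into the memory while recall from a partial or noisy observation supports state prediction.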
