Abstract
Neuromorphic computing is considered a promising approach to developing general artificial intelligence. Like the human brain, a neuromorphic system uses neurons as the basic units for computing and storing information. Neural networks have made great progress on some intelligent tasks by borrowing from the way the brain computes. Memory storage in neuromorphic systems is another essential part of forming artificial intelligence, yet effective and practical methods for storing information in such systems are still lacking. In this paper, we propose a brain-like full-neuron memory (FNM) model to store memories in neuromorphic systems, and we successfully stored ten numbers and letters in it. Memory objects are learned through Hebbian synaptic plasticity in an unsupervised manner and stored as neural activations in FNM. FNM works in a content-addressable way: the stored numbers and letters can be recalled by associated visual or auditory stimuli. Memories are hierarchically structured in FNM, and this hierarchical structure allows the numbers and letters to be recalled in full from incomplete input stimuli. Experiments are designed to verify the memory ability of the FNM model. With the FNM model, complete information processing and storage can be performed within a neuromorphic system, without moving intermediate results to other computing platforms for processing or storage.
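The core mechanism the abstract describes — Hebbian learning of patterns that can later be recalled content-addressably from incomplete stimuli — can be illustrated with a classical Hopfield-style associative memory. This is only a minimal sketch of the general principle, not the FNM model itself (which uses spiking neurons and a hierarchical structure); all function names here are illustrative.

```python
import numpy as np

def train_hebbian(patterns):
    """Store bipolar (+1/-1) patterns via the Hebbian outer-product rule."""
    n = patterns.shape[1]
    W = np.zeros((n, n))
    for p in patterns:
        W += np.outer(p, p)      # Hebbian update: co-active units strengthen
    np.fill_diagonal(W, 0)       # no self-connections
    return W / len(patterns)

def recall(W, probe, steps=10):
    """Iterate the network state until it settles on a stored pattern."""
    s = probe.astype(float).copy()
    for _ in range(steps):
        s = np.sign(W @ s)
        s[s == 0] = 1            # break ties toward +1
    return s

# Store two 8-unit patterns, then recall from a corrupted probe
# (an "incomplete input stimulus" in the abstract's terms).
patterns = np.array([
    [1, 1, 1, 1, -1, -1, -1, -1],
    [1, -1, 1, -1, 1, -1, 1, -1],
])
W = train_hebbian(patterns)

probe = patterns[0].copy()
probe[:2] = -1                   # corrupt two of eight units
restored = recall(W, probe)      # converges back to the stored pattern
```

The probe acts as content, not as an address: recall succeeds because the corrupted input still overlaps the stored pattern more than any other, which is the essence of content-addressable memory.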