Abstract

Neural associative memory (AM) is one of the critical building blocks for cognitive computing systems. It memorizes (learns) and retrieves input data by the information content itself. A key challenge in designing AM for intelligent devices is to expand memory capacity while using minimal hardware and energy resources. However, prior art shows that memory capacity grows only slowly, i.e., with the square root of the total number of synaptic weights. To tackle this problem, we propose a synapse model called recursive synaptic bit reuse, which enables near-linear scaling of memory capacity with the total number of synaptic bits. Our model also handles correlated input data more robustly than the conventional model. We evaluated our model in the context of Hopfield neural networks (HNNs) containing 5–327 KB of data storage for synaptic weights. Our model can increase the memory capacity of HNNs by as much as $30\times$ over conventional HNNs. A very-large-scale integration (VLSI) implementation of HNNs in 65-nm technology confirms that the proposed model can save up to $19\times$ in area and up to $232\times$ in energy dissipation compared with the conventional model. These savings are expected to grow with network size.
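For context on the conventional baseline the abstract compares against, the sketch below shows a minimal Hopfield network with Hebbian (outer-product) learning and asynchronous recall in NumPy. This is only an illustration of the standard model whose capacity scales sublinearly; it does not implement the paper's recursive synaptic bit reuse scheme, whose details are not given in the abstract. The function names (`train_hopfield`, `recall`) and parameter values are illustrative assumptions.

```python
import numpy as np

def train_hopfield(patterns):
    """Hebbian (outer-product) learning for a conventional Hopfield network.

    patterns: array of shape (P, N) with entries in {-1, +1}.
    Returns an N x N symmetric weight matrix with zero diagonal.
    """
    P, N = patterns.shape
    W = patterns.T @ patterns / N           # sum of outer products, scaled by N
    np.fill_diagonal(W, 0.0)                # no self-connections
    return W

def recall(W, probe, max_sweeps=100):
    """Asynchronous recall: update one neuron at a time until a fixed point."""
    state = probe.copy()
    N = state.size
    for _ in range(max_sweeps):
        changed = False
        for i in np.random.permutation(N):  # random update order
            new_val = 1 if W[i] @ state >= 0 else -1
            if new_val != state[i]:
                state[i] = new_val
                changed = True
        if not changed:                     # converged to a stored attractor (or spurious state)
            break
    return state

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, P = 100, 5                           # Hebbian capacity is only ~0.14*N random patterns
    patterns = rng.choice([-1, 1], size=(P, N))
    W = train_hopfield(patterns)

    # Corrupt 10% of the bits of one stored pattern and try to retrieve it.
    probe = patterns[0].copy()
    flip = rng.choice(N, size=N // 10, replace=False)
    probe[flip] *= -1
    recovered = recall(W, probe)
    print("Recovered original pattern:", np.array_equal(recovered, patterns[0]))
```

In this conventional formulation the N x N weight matrix is the dominant storage cost, which is the scaling bottleneck the proposed synapse model targets.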
