Abstract

As the efficiency of neuromorphic systems improves, biologically inspired learning techniques are becoming increasingly appealing for a range of computing applications, from pattern and character recognition to general-purpose reconfigurable logic. Due to their functional similarity to synapses in the brain, memristors are becoming a key element in hardware realizations of Hebbian learning systems. By pairing such devices with a perceptron-based neuron model using a threshold activation function, previous work has shown that a neural logic block capable of learning any linearly separable function in real time can be developed. In this configuration, however, no function with two or more decision boundaries can be learned in a single layer. While previous memristor-based neural logic block designs have achieved very low area and high performance compared to Look-Up Tables (LUT) and Capacitive Threshold Logic (CTL), the restriction on the set of learnable functions has made networks of these logic blocks impractical to scale to realistic applications. By integrating an additional layer of memristors into a neural logic block, this paper proposes a logic block with an adaptive activation function. The resulting logic block is capable of learning any function in a single layer, reducing the number of logic blocks required to implement a single 4-input function by up to a factor of 10 and significantly improving training time. When considered as a building block for ISCAS-85 benchmark circuits, the proposed logic block achieves an Energy-Delay Product (EDP) up to 97.8% lower than a neural logic block with a threshold activation function. Furthermore, its performance improvement over a CMOS LUT implementation ranges from 78.08% to 97.43% across all ISCAS-85 circuits.
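The single-boundary limitation described above can be illustrated in software. The sketch below (not the paper's memristor hardware model; names and parameters are illustrative) trains a perceptron with a hard threshold activation using the classic perceptron learning rule: it converges on a linearly separable function such as 2-input AND, but cannot learn XOR, which requires two decision boundaries.

```python
# Illustrative sketch only -- a software perceptron with a hard threshold
# activation, not the paper's memristor-based logic block.

def train_perceptron(samples, epochs=100, lr=0.1):
    """Train weights [w1, w2, bias] with the perceptron learning rule."""
    w = [0.0, 0.0, 0.0]
    for _ in range(epochs):
        for (x1, x2), target in samples:
            y = 1 if w[0] * x1 + w[1] * x2 + w[2] > 0 else 0
            err = target - y
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            w[2] += lr * err
    return w

def predict(w, x1, x2):
    """Hard threshold activation on the weighted sum."""
    return 1 if w[0] * x1 + w[1] * x2 + w[2] > 0 else 0

AND = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

w_and = train_perceptron(AND)
w_xor = train_perceptron(XOR)

# AND is linearly separable, so training converges and all cases match.
and_ok = all(predict(w_and, x1, x2) == t for (x1, x2), t in AND)
# XOR needs two decision boundaries, so a single threshold unit must fail
# on at least one input pattern -- the gap the adaptive activation closes.
xor_ok = all(predict(w_xor, x1, x2) == t for (x1, x2), t in XOR)
```

Here `and_ok` ends up true while `xor_ok` is false: this is exactly the class of function an adaptive activation function lets a single-layer block learn.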
