Abstract

In this article, we design a memristive competitive neural network circuit based on the winner-take-all (WTA) mechanism and the online Hebbian learning rule. Each synapse of the network contains two memristors whose signal-input terminals are connected with opposite polarity. Only one of the two memristors participates in each calculation, and which one is determined by the original input signal. The competitive neural network circuit consists of two parts: forward calculation and weight update. The forward calculation part of the circuit is designed based on the WTA mechanism, and lateral inhibition among neurons is realized by combining the leaky-integrate-and-fire (LIF) model with pMOS transistors. The weight-update part is designed based on the Hebbian learning rule. In each cycle, only the synaptic memristors connected to the output neuron that wins the forward calculation are adjusted, and the voltage used for this adjustment comes from the membrane voltage of the winning output neuron. The whole neural network circuit operates without a central processing unit (CPU) or a field-programmable gate array (FPGA), realizing parallel computation, savings in area and power consumption, and, to a certain extent, computing-in-memory. Based on the circuit designed in PSPICE, we simulated the classification of $5\times3$ pixel images. The trend of the weights during the training phase and the high accuracy in the recognition phase show that the network can learn and recognize different patterns. The competitive neural network can be applied to neuromorphic systems for visual pattern recognition.
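
To illustrate the learning loop described above, the following is a minimal behavioral sketch in Python, not the authors' PSPICE circuit: a winner-take-all forward pass followed by a Hebbian-style update applied only to the winner's synapses. The function names (`lif_wta`, `hebbian_update`), the learning rate, the conductance bounds, and the simplified update rule are illustrative assumptions, not values or equations from the article.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_outputs = 15, 3           # e.g. 5x3 binary pixel patterns, 3 output neurons
W = rng.uniform(0.3, 0.7, size=(n_outputs, n_inputs))  # synaptic "conductances" (assumed range)
eta = 0.05                            # learning rate (assumed)
g_min, g_max = 0.0, 1.0               # memristor conductance bounds (assumed)

def lif_wta(x, W):
    """Forward pass: input currents charge each output neuron's membrane;
    the most strongly driven neuron fires first and, via lateral inhibition
    in the circuit, suppresses the others. Here this is abstracted as argmax."""
    membrane = W @ x                  # weighted sum of input currents
    return int(np.argmax(membrane))   # index of the winning neuron

def hebbian_update(x, W, winner):
    """Only the winner's row of synapses is adjusted: conductance moves toward
    the input pattern (a standard competitive-learning simplification of the
    circuit's memristor adjustment driven by the winner's membrane voltage)."""
    W[winner] += eta * (x - W[winner])
    np.clip(W[winner], g_min, g_max, out=W[winner])

# Toy training loop over random binary patterns (placeholder data).
patterns = rng.integers(0, 2, size=(30, n_inputs)).astype(float)
for epoch in range(20):
    for x in patterns:
        winner = lif_wta(x, W)
        hebbian_update(x, W, winner)
```

After training, each row of `W` drifts toward one cluster of input patterns, which mirrors the weight trends the article reports during the training phase; the actual circuit performs this adjustment in analog hardware rather than in software.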
