Abstract
With the rapid advancement of artificial intelligence, the energy-consumption bottleneck inherent in the von Neumann computing architecture poses a significant obstacle to the future development of edge computing, artificial intelligence, and information technology. Consequently, it is crucial to develop synaptic neural circuits that exhibit memory and learning properties through synaptic plasticity. Drawing inspiration from a side-gated graphene synaptic transistor, we have designed a synaptic neural circuit comprising four key components: pre-voltage input, synaptic weight modulation, the electric double-layer effect, and post-membrane current response. Through comprehensive simulations, we have successfully mimicked various synaptic behaviours, including long-term and short-term synaptic plasticity, paired-pulse facilitation, spike-rate-dependent plasticity, spike-timing-dependent plasticity, and Pavlovian associative learning. This approach establishes a robust framework for designing synaptic neural network circuits with advanced learning capabilities, thereby enhancing the practical applications of neural networks and machine learning.
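The abstract does not give the circuit's weight-update equations, but the spike-timing-dependent plasticity (STDP) it mentions is commonly modelled with an exponential kernel. The sketch below is an illustrative assumption, not the paper's method: the function name `stdp_delta_w`, the amplitudes `a_plus`/`a_minus`, and the time constants `tau_plus`/`tau_minus` are all hypothetical placeholders for parameters one would fit to the measured device response.

```python
import math

def stdp_delta_w(dt_ms, a_plus=0.1, a_minus=0.12, tau_plus=20.0, tau_minus=20.0):
    """Synaptic weight change for one pre/post spike pair.

    dt_ms = t_post - t_pre (milliseconds). Positive dt (pre fires before
    post) causes potentiation; negative dt causes depression. Amplitudes
    and time constants are illustrative, not from the source paper.
    """
    if dt_ms > 0:
        return a_plus * math.exp(-dt_ms / tau_plus)   # potentiation branch
    elif dt_ms < 0:
        return -a_minus * math.exp(dt_ms / tau_minus)  # depression branch
    return 0.0  # coincident spikes: no change in this simple model
```

In such a model the magnitude of the weight change decays as the spike interval grows, which is the qualitative signature STDP measurements typically show.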