Abstract

Spiking neural networks, the most realistic artificial representation of biological nervous systems, are promising due to their inherent local training rules, which enable low-overhead online learning, and their energy-efficient information encoding. Their downside is the more demanding functionality required of the artificial synapses, notably including spike-timing-dependent plasticity, which makes their compact and efficient hardware implementation challenging with conventional device technologies. Recent work showed that memristors are excellent candidates for artificial synapses, although reports of even simple neuromorphic systems are still very rare. In this study, we experimentally demonstrate coincidence detection using a spiking neural network, implemented with passively integrated metal-oxide memristive synapses connected to an analogue leaky-integrate-and-fire silicon neuron. By employing spike-timing-dependent plasticity learning, the network is able to robustly detect coincidence by selectively increasing the synaptic efficacies corresponding to the synchronized inputs. Not surprisingly, our results indicate that device-to-device variation is the main challenge towards the realization of more complex spiking networks.

Highlights

  • Spiking neural networks, the most realistic artificial representation of biological nervous systems, are promising due to their inherent local training rules that enable low-overhead online learning, and energy-efficient information encoding

  • The membrane potential U(t) is measured with respect to the resting potential. Another important SNN feature is spike-timing-dependent plasticity (STDP), a timing-dependent specialization of Hebbian learning [1,2,5,6]

  • A typical goal of STDP learning is to strengthen the synaptic efficacy when two events happen in the expected causal temporal order, and to weaken it otherwise
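The leaky-integrate-and-fire behaviour mentioned above can be illustrated with a minimal simulation sketch. This is not the paper's silicon-neuron circuit; the membrane time constant, resistance, and threshold below are illustrative placeholder values chosen only to show the leak-integrate-fire-reset dynamics.

```python
def simulate_lif(input_current, dt=1e-3, tau_m=20e-3, r_m=1e7,
                 u_rest=0.0, u_thresh=0.02):
    """Leaky integrate-and-fire neuron (illustrative parameters).

    The membrane potential u leaks toward the resting potential u_rest,
    integrates the input current, and on crossing u_thresh the neuron
    emits a spike and u is reset to u_rest.
    """
    u = u_rest
    spike_times, trace = [], []
    for step, i_in in enumerate(input_current):
        # Discretized membrane equation: tau_m * du/dt = -(u - u_rest) + R*I
        u += (-(u - u_rest) + r_m * i_in) * dt / tau_m
        if u >= u_thresh:
            spike_times.append(step * dt)  # record spike, then reset
            u = u_rest
        trace.append(u)
    return spike_times, trace
```

For example, a constant suprathreshold input current produces regular firing, while a weak input lets the leak term keep u below threshold so the neuron stays silent.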



Introduction

Spiking neural networks (SNNs), the most realistic artificial representation of biological nervous systems, are promising due to their inherent local training rules, which enable low-overhead online learning, and their energy-efficient information encoding. Their downside is the more demanding functionality required of the artificial synapses, notably including spike-timing-dependent plasticity, which makes their compact and efficient hardware implementation challenging with conventional device technologies. The most popular SNN weight-update rules are local, requiring only information from the pre- and post-synaptic neurons (e.g., see Eq. 2 below), which could be a significant advantage for compact and low-power implementations of real-time training and for scaling towards more complex networks [4]. The STDP rule formally describes the change in synaptic weight as a specific function fSTDP of the difference in firing times between the presynaptic (tpre) and post-synaptic (tpost) spikes, i.e.,

ΔG = fSTDP(tpre − tpost). (2)
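A common choice for fSTDP, used here purely as an illustrative sketch (the paper does not specify its exact window or amplitudes), is an exponential window: causal spike pairs (pre before post) potentiate the synapse, acausal pairs depress it.

```python
import math

def f_stdp(delta_t, a_plus=0.01, a_minus=0.012, tau=20e-3):
    """Exponential STDP window, Eq. (2): delta_t = t_pre - t_post.

    delta_t < 0 means the presynaptic spike preceded the postsynaptic
    spike (causal order), so the weight change dG is positive
    (potentiation); otherwise dG is negative (depression). Amplitudes
    a_plus, a_minus and time constant tau are illustrative values.
    """
    if delta_t < 0:
        return a_plus * math.exp(delta_t / tau)    # strengthen
    return -a_minus * math.exp(-delta_t / tau)     # weaken
```

With this window, synchronized inputs that consistently arrive just before the postsynaptic spike accumulate positive weight changes, which is the mechanism behind the coincidence detection demonstrated in the paper.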

