Abstract

Spiking neural networks (SNNs) are a computational tool in which information is coded into spikes, as in some parts of the brain, in contrast to conventional neural networks (NNs), which compute over real numbers. SNNs can therefore implement intelligent information extraction in real time at the edge, close to data acquisition, and represent a complementary solution to conventional NNs, which are better suited to cloud computing. Both classes of networks face hardware constraints due to limited computing parallelism and the separation of logic and memory. Emerging memory devices, such as resistive switching memories and phase change memories, or memristive devices in general, are strong candidates to remove these hurdles for NN applications. The well-established training procedures of conventional NNs have helped define the desiderata for memristive device dynamics implementing synaptic units: the generally agreed requirements are a linear evolution of the memristive conductance upon stimulation with trains of identical pulses and a symmetric change for conductance increase and decrease. Conversely, little work has been done to understand which memristive device properties support efficient SNN operation, because a background theory for SNN training is lacking; as a consequence, the requirements derived for conventional NNs have been taken as a reference when developing memristive devices for SNNs. In the present work, we show that, for efficient CMOS/memristive SNNs, the requirements on the synaptic memristive dynamics are very different from the needs of a conventional NN. We perform system-level simulations of an SNN trained to classify hand-written digit images through a spike-timing-dependent plasticity protocol, considering various plausible linear and non-linear synaptic memristive dynamics. We consider memristive dynamics bounded by artificial hard conductance values as well as dynamics limited by the natural evolution toward asymptotic values (soft boundaries). We quantitatively analyze the impact of the resolution and non-linearity of the synapses on network training and classification performance. Finally, we demonstrate that non-linear synapses with hard boundary values enable higher classification performance and realize the best trade-off between classification accuracy and required training time. Based on these results, we discuss how memristive devices with non-linear dynamics constitute a technologically convenient solution for the development of on-line SNN training.
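
The abstract contrasts two classes of weight-update behaviour; the minimal Python sketch below illustrates what is meant by a linear update bounded by hard conductance values versus a non-linear update that saturates toward soft (asymptotic) boundaries. The function names and the parameters g_min, g_max, n_levels and nu are illustrative placeholders, not the device model or parameter values used in the paper.

    import numpy as np

    # Illustrative potentiation/depression models for a memristive synapse.
    # g_min, g_max, n_levels and the non-linearity parameter `nu` are
    # placeholder values, not the device parameters used in the paper.
    g_min, g_max = 1e-6, 1e-4      # conductance bounds (S)
    n_levels = 64                  # pulses needed to span the full range

    def update_linear_hard(g, potentiate=True):
        """Linear dynamics: identical pulses move G by a fixed step,
        clipped at artificial hard boundaries."""
        step = (g_max - g_min) / n_levels
        g += step if potentiate else -step
        return float(np.clip(g, g_min, g_max))

    def update_nonlinear_soft(g, potentiate=True, nu=4.0):
        """Non-linear dynamics: the step shrinks as G approaches its
        asymptotic (soft) boundary, mimicking natural device behaviour."""
        if potentiate:
            return g + (g_max - g) * (1 - np.exp(-nu / n_levels))
        return g - (g - g_min) * (1 - np.exp(-nu / n_levels))

Under repeated identical pulses the first model traces a straight ramp that stops abruptly at g_max or g_min, while the second traces a saturating curve whose steps shrink as the boundary is approached.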

Highlights

  • Spiking Neural Networks (SNNs) have received a renewed wave of interest from a computational point of view, as a tool to move the heavy data-analysis workload from the cloud to the edge

  • The aim of the present work is to test SNN constituents and architectures that can be realized in hybrid CMOS/memristor technology (Valentian et al., 2019; Regev et al., 2020)

  • We analyzed the impact of the synaptic weight dynamics on the performance of a two-layer fully-connected SNN compatible with a hybrid CMOS/memristive implementation and trained through an unsupervised Spike Timing Dependent Plasticity (STDP) protocol (a minimal STDP sketch follows these highlights)
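
As a pointer to what an STDP protocol looks like, here is a minimal sketch of a pair-based STDP rule; the time constants and amplitudes (tau_plus, tau_minus, a_plus, a_minus) are illustrative assumptions, not the values of the protocol used in this work.

    import numpy as np

    # Pair-based STDP weight update; the constants below are illustrative.
    tau_plus, tau_minus = 20.0, 20.0   # decay time constants (ms)
    a_plus, a_minus = 0.01, 0.012      # potentiation / depression amplitudes

    def stdp_dw(t_pre, t_post):
        """Weight change for a single pre/post spike pair.
        Pre-before-post (dt > 0) potentiates, post-before-pre depresses."""
        dt = t_post - t_pre
        if dt > 0:
            return a_plus * np.exp(-dt / tau_plus)
        return -a_minus * np.exp(dt / tau_minus)

    # Example: a pre-spike 5 ms before a post-spike strengthens the synapse,
    # the reverse ordering weakens it.
    dw_pot = stdp_dw(t_pre=10.0, t_post=15.0)   # > 0
    dw_dep = stdp_dw(t_pre=15.0, t_post=10.0)   # < 0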

Introduction

Spiking Neural Networks (SNNs) have received a renewed wave of interest from a computational point of view, as a tool to move the heavy data-analysis workload from the cloud to the edge. Hardware supporting SNNs is mainly based on research platforms. In both cases, though, the lack of parallelism and the separation between storage and computing units are still an issue for which solutions are under investigation. To this aim, emerging memory devices compatible with the back end of the CMOS production line, and in particular resistive switching random access memories (RRAM), commonly named memristive devices, are considered among the best candidates for hardware solutions supporting NNs and SNNs. In particular, so-called neuromorphic systems intend to use memristive devices to update the synaptic weights of a network during training and to store them for inference.
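
To make the last point concrete, the toy snippet below shows how a crossbar of memristive conductances can hold the synaptic weights of a two-layer network and deliver the weighted sum of input spikes in a single parallel read. Array sizes, the read voltage and the conductance range are illustrative assumptions, not the architecture simulated in the paper.

    import numpy as np

    # Toy view of a memristive crossbar as the synaptic matrix of a
    # two-layer SNN: each column stores the conductances of one output
    # neuron, and a vector of input spikes produces the column currents
    # in one parallel read (Ohm's law plus Kirchhoff current summation).
    rng = np.random.default_rng(0)
    n_in, n_out = 784, 10                        # e.g. image pixels -> output neurons
    G = rng.uniform(1e-6, 1e-4, (n_in, n_out))   # learned conductances (S)
    v_read = 0.1                                 # read voltage (V)

    spikes = (rng.random(n_in) < 0.05).astype(float)  # binary input spike vector
    i_out = v_read * spikes @ G                       # output currents per column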
