Abstract

In this study, we present a highly configurable neuromorphic computing substrate and use it for emulating several types of neural networks. At the heart of this system lies a mixed-signal chip, with analog implementations of neurons and synapses and digital transmission of action potentials. Major advantages of this emulation device, which has been explicitly designed as a universal neural network emulator, are its inherent parallelism and high acceleration factor compared to conventional computers. Its configurability allows the realization of almost arbitrary network topologies and the use of widely varied neuronal and synaptic parameters. Fixed-pattern noise inherent to analog circuitry is reduced by calibration routines. An integrated development environment allows neuroscientists to operate the device without any prior knowledge of neuromorphic circuit design. As a showcase for the capabilities of the system, we describe the successful emulation of six different neural networks which cover a broad spectrum of both structure and functionality.

Highlights

  • By nature, computational neuroscience has a high demand for powerful and efficient devices for simulating neural network models

  • We have successfully implemented a variety of neural microcircuits on a single universal neuromorphic substrate, which is described in detail by Schemmel et al. (2006)

  • All networks show activity patterns qualitatively and to some extent quantitatively similar to those obtained by software simulations


Summary

INTRODUCTION

Computational neuroscience has a high demand for powerful and efficient devices for simulating neural network models. The arguably most characteristic feature of neuromorphic devices is their inherent parallelism, enabled by the fact that individual neural network components (essentially neurons and synapses) are physically implemented in silicon. Due to this parallelism, scaling up emulated network models does not imply a slowdown, as is usually the case on conventional machines. The cost of configurability can be counterbalanced by decreasing precision. This could concern the size of integration time steps (Imam et al., 2012a), the granularity of particular parameters (Pfeil et al., 2012), or fixed-pattern noise affecting various network components. At least the latter can, to some extent, be moderated through elaborate calibration methods (Neftci and Indiveri, 2010; Brüderle et al., 2011; Gao et al., 2012). We present six networks emulated on our hardware system, each requiring its own hardware configuration in terms of network topology as well as neuronal and synaptic parameters.
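The calibration idea mentioned above can be illustrated with a minimal sketch. The model below is hypothetical (not the system's actual calibration routine, whose details are not given here): we assume each analog neuron's firing rate under a reference stimulus deviates from the target by a fixed, neuron-specific mismatch factor, and we correct it by measuring each neuron once and rescaling its bias accordingly.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical fixed-pattern noise model: each hardware neuron's response
# to the same bias differs by a fixed multiplicative mismatch factor.
N_NEURONS = 16
TARGET_RATE = 50.0  # desired firing rate (Hz) under a reference stimulus

mismatch = rng.normal(loc=1.0, scale=0.15, size=N_NEURONS)  # unknown to the user

def measure_rate(bias):
    """Stand-in for a hardware measurement: rate scales with bias / mismatch."""
    return TARGET_RATE * bias / mismatch

# Calibration: start from the nominal bias, measure each neuron once,
# and rescale its bias so the measured rate matches the target.
bias = np.ones(N_NEURONS)
measured = measure_rate(bias)
bias *= TARGET_RATE / measured        # per-neuron correction factors

calibrated = measure_rate(bias)
spread_before = measured.std()
spread_after = calibrated.std()
print(f"rate spread before: {spread_before:.2f} Hz, after: {spread_after:.4f} Hz")
```

In a real system the per-neuron correction factors would be stored and applied whenever the network is configured, so the one-time measurement cost is amortized over all subsequent emulations.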

THE NEUROMORPHIC SYSTEM
HARDWARE EMULATION OF NEURAL NETWORKS
Findings
DISCUSSION

