Abstract

Spiking neural networks (SNNs) have been proposed both as models of cortical computation and as candidates for solving problems in machine learning. While a growing body of recent work has improved their performance on benchmark discriminative tasks, most of these approaches learn through surrogates of backpropagation, in which biological features such as spikes are treated more as defects than as merits. In this thesis, we explore the generative abilities of SNNs with built-in biological mechanisms. When sampling from high-dimensional multimodal distributions, models based on general Markov chain Monte Carlo (MCMC) methods often suffer from a mixing problem: the sampler easily gets trapped in local minima of the energy landscape. Inspired by traditional annealing and tempering approaches, we demonstrate that increasing the rate of background Poisson noise in an SNN can flatten the energy landscape and facilitate mixing of the system. In addition, we show that with synaptic short-term plasticity (STP) the SNN can achieve more efficient mixing through local modulation of active attractors, eventually outperforming traditional benchmark models. We reveal diverse sampling statistics of SNNs induced by STP and finally study its implementation in conventional machine learning methods. Our work thereby highlights important computational consequences of biological features that might otherwise appear as artifacts of evolution.
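The mixing problem and the tempering idea mentioned above can be illustrated with a toy sketch that is not the thesis's spiking model: a random-walk Metropolis sampler on a hypothetical double-well energy (two modes standing in for attractors). At low temperature the chain stays trapped in the well where it starts; raising the temperature plays the role of the increased background noise rate, flattening the landscape so the chain crosses between modes. All names (`energy`, `metropolis`, the temperatures) are illustrative choices, not quantities from the thesis.

```python
import math
import random

def energy(x):
    # Double-well energy: minima at x = -1 and x = +1,
    # separated by a barrier of height 4 at x = 0.
    return 4.0 * (x * x - 1.0) ** 2

def metropolis(temperature, steps=20000, seed=0):
    """Random-walk Metropolis sampler at a fixed temperature."""
    rng = random.Random(seed)
    x = -1.0  # start in the left well
    samples = []
    for _ in range(steps):
        proposal = x + rng.gauss(0.0, 0.3)
        delta = energy(proposal) - energy(x)
        # Accept downhill moves always; uphill moves with
        # probability exp(-delta / T) -- higher T flattens the landscape.
        if delta <= 0.0 or rng.random() < math.exp(-delta / temperature):
            x = proposal
        samples.append(x)
    return samples

def mode_fractions(samples):
    # Fraction of time spent in each well (left: x < 0, right: x > 0).
    right = sum(1 for x in samples if x > 0.0) / len(samples)
    return 1.0 - right, right

cold = mode_fractions(metropolis(temperature=0.2))  # trapped in one mode
hot = mode_fractions(metropolis(temperature=2.0))   # visits both modes
```

In the cold run essentially all mass stays in the starting well, while the hot run spends a substantial fraction of time in each well; annealing schemes exploit exactly this by visiting high temperatures to hop between modes.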

