Abstract
Cortical neurons are silent most of the time: sparse activity enables low-energy computation in the brain, and promises to do the same in neuromorphic hardware. Beyond power efficiency, sparse codes have favourable properties for associative learning, as they can store more information than local codes but are easier to read out than dense codes. Auto-encoders with a sparsity constraint can learn sparse codes, and so can single-layer networks that combine recurrent inhibition with unsupervised Hebbian learning. But the latter usually require fast homeostatic plasticity, which could lead to catastrophic forgetting in embodied agents that learn continuously. Here we set out to explore whether plasticity at recurrent inhibitory synapses could take up that role instead, regulating both the population sparseness and the firing rates of individual neurons. We put the idea to the test in a network that employs compartmentalised inputs to solve the task: rate-based dendritic compartments integrate the feedforward input, while spiking integrate-and-fire somas compete through recurrent inhibition. A somato-dendritic learning rule allows somatic inhibition to modulate nonlinear Hebbian learning in the dendrites. Trained on MNIST digits and natural images, the network discovers independent components that form a sparse encoding of the input and support linear decoding. These findings confirm that intrinsic homeostatic plasticity is not strictly required for regulating sparseness: inhibitory synaptic plasticity can have the same effect. Our work illustrates the usefulness of compartmentalised inputs, and makes the case for moving beyond point neuron models in artificial spiking neural networks.
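The abstract names the ingredients but not the implementation, so the following minimal Python sketch is our own reconstruction under stated assumptions: the names (W, M, rho0, tau_m, etc.), the binary-spike-gated Oja update standing in for the somato-dendritic Hebbian rule, and the Vogels-et-al.-style inhibitory plasticity rule are all illustrative choices, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes and constants -- illustrative choices, not the paper's.
n_in, n_out = 64, 16          # input dimension, number of coding neurons
dt, tau_m = 1e-3, 20e-3       # time step and somatic membrane time constant (s)
v_thresh, v_reset = 1.0, 0.0  # leaky integrate-and-fire threshold / reset
eta_w, eta_m = 1e-3, 1e-2     # feedforward / inhibitory learning rates
rho0, tau_r = 5.0, 1.0        # target rate (Hz) and rate-estimator constant (s)

W = rng.uniform(0.0, 0.05, (n_out, n_in))  # dendritic (feedforward) weights
M = np.zeros((n_out, n_out))               # recurrent inhibitory weights (>= 0)
v = np.zeros(n_out)                        # somatic membrane potentials
rate = np.full(n_out, rho0)                # running firing-rate estimates (Hz)
prev_spikes = np.zeros(n_out)              # spikes from the previous step

def step(x):
    """Advance the network one time step on input pattern x."""
    global W, M, v, rate, prev_spikes
    dend = W @ x                             # rate-based dendritic compartment
    # LIF soma: leaky integration of the dendritic drive, with recurrent
    # inhibition from last step's spikes applied with a one-step delay.
    v += dt / tau_m * (-v + dend) - M @ prev_spikes
    spikes = (v >= v_thresh).astype(float)
    v[spikes > 0] = v_reset

    # Oja-style Hebbian update with binary postsynaptic activity: recurrent
    # inhibition decides who fires, and thereby which dendrites learn -- a
    # stand-in for the somato-dendritic modulation described in the abstract.
    W += eta_w * spikes[:, None] * (x[None, :] - W)

    # Inhibitory plasticity in the style of Vogels et al. (2011): presynaptic
    # spikes potentiate inhibition onto neurons firing above the target rate
    # rho0 and depress it below, so rates are regulated without any intrinsic
    # (per-neuron threshold) homeostasis.
    rate += (spikes - dt * rate) / tau_r     # low-pass spike count, in Hz
    M += eta_m * np.outer(rate - rho0, spikes)
    np.clip(M, 0.0, None, out=M)             # inhibitory weights stay non-negative
    np.fill_diagonal(M, 0.0)                 # no self-inhibition
    prev_spikes = spikes
    return spikes

# Toy usage: drive the network with random non-negative input patterns.
for _ in range(20000):
    step(rng.random(n_in))
```

In this sketch the inhibitory rule alone pulls each neuron toward the target rate rho0 while the competition it induces sparsifies the population code, which is the division of labour the abstract argues for; how closely this toy reproduces the paper's results is untested here.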