Abstract

During development, biological neural networks produce more synapses and neurons than needed. Many of these synapses and neurons are later removed in a process known as neural pruning. Why networks should initially be over-populated, and which processes determine the synapses and neurons that are ultimately pruned, remain unclear. We study the mechanisms and significance of neural pruning in model neural networks. In a deep Boltzmann machine model of sensory encoding, we find that (1) synaptic pruning is necessary to learn efficient network architectures that retain computationally-relevant connections, (2) pruning by synaptic weight alone does not optimize network size and (3) pruning based on a locally-available measure of importance derived from Fisher information allows the network to identify structurally important vs. unimportant connections and neurons. This locally-available measure of importance has a biological interpretation in terms of the correlations between presynaptic and postsynaptic neurons, and implies an efficient activity-driven pruning rule. Overall, we show how local activity-dependent synaptic pruning can solve the global problem of optimizing a network architecture. We relate these findings to biology as follows: (I) Synaptic over-production is necessary for activity-dependent connectivity optimization. (II) In networks that have more neurons than needed, cells compete for activity, and only the most important and selective neurons are retained. (III) Cells may also be pruned due to a loss of synapses on their axons. This occurs when the information they convey is not relevant to the target population.
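The locally-available importance measure described above can be illustrated with a small numerical sketch. The code below is not the authors' implementation; it assumes a toy Restricted Boltzmann Machine with random weights, approximates each connection's Fisher information by the variance of the presynaptic–postsynaptic coactivation statistic (a diagonal approximation consistent with the correlation-based interpretation in the abstract), and prunes the least-important connections. All sizes, the 30% pruning fraction, and the one-step sampling scheme are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy RBM: 20 visible and 10 hidden units; weights are random for illustration.
n_v, n_h = 20, 10
W = rng.normal(0.0, 0.5, size=(n_v, n_h))

# Sample paired visible/hidden activity (random visible states, one
# stochastic hidden update -- a stand-in for network activity during learning).
V = (rng.random((500, n_v)) < 0.5).astype(float)
H = (rng.random((500, n_h)) < sigmoid(V @ W)).astype(float)

# Locally-available importance: variance of the coactivation v_i * h_j,
# used here as a diagonal approximation of each weight's Fisher information.
coact = V[:, :, None] * H[:, None, :]      # shape (samples, n_v, n_h)
importance = coact.var(axis=0)             # shape (n_v, n_h)

# Prune the 30% of connections with the lowest importance.
k = int(0.3 * W.size)
thresh = np.partition(importance.ravel(), k)[k]
mask = importance >= thresh
W_pruned = W * mask

print("connections removed:", int((~mask).sum()), "of", W.size)
```

Each unit only needs statistics of its own inputs and outputs to compute this importance, which is what makes the rule "local" in the sense used in the abstract.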

Highlights

  • The number of neurons and synapses initially formed during brain development far exceeds those in the mature brain [1]

  • In a deep Boltzmann machine model of sensory encoding, we find that (1) synaptic pruning is necessary to learn efficient network architectures that retain computationally-relevant connections, (2) pruning by synaptic weight alone does not optimize network size and (3) pruning based on a locally-available measure of importance based on Fisher information allows the network to identify structurally important vs. unimportant connections and neurons

  • By ‘relevant network topology’ we mean a topology optimized for computational needs that includes only neurons and synapses that are relevant for the task at hand. In our experiments this task is the encoding of visual stimuli in hidden representations of Restricted Boltzmann Machines (RBMs) and deep Boltzmann machines (DBMs).


Introduction

The number of neurons and synapses initially formed during brain development far exceeds those in the mature brain [1]. Up to half of the cells and connections are lost to pruning [2,3,4,5,6,7]. This process of initial over-growth and subsequent reduction suggests that the optimal wiring of the brain is not entirely predetermined. The removal of unneeded neurons and synapses reduces the high material and metabolic costs of the brain [9]. There appears to be a sweet spot between pruning and over-pruning at which the brain remains adaptive and resilient against damage.
