Abstract

Hebbian plasticity, a mechanism believed to be the substrate of learning and memory, detects and further enhances correlated neural activity. Because this constitutes an unstable positive feedback loop, it requires additional homeostatic control. Computational work suggests that in recurrent networks, the homeostatic mechanisms observed in experiments are too slow to compensate for the instabilities arising from Hebbian plasticity and need to be complemented by rapid compensatory processes. We suggest presynaptic inhibition as a candidate that rapidly provides stability by compensating for the recurrent excitation induced by Hebbian changes. Presynaptic inhibition is mediated by presynaptic GABA receptors that effectively and reversibly attenuate transmitter release. Activation of these receptors can be triggered by excess network activity, hence providing a stabilising negative feedback loop that weakens recurrent interactions on sub-second timescales. We study the stabilising effect of presynaptic inhibition in recurrent networks, in which presynaptic inhibition is implemented as a multiplicative reduction of recurrent synaptic weights in response to increasing inhibitory activity. We show that networks with presynaptic inhibition display a gradual increase of firing rates with growing excitatory weights, in contrast to traditional excitatory-inhibitory networks. This alleviates the positive feedback loop between Hebbian plasticity and network activity and thereby allows homeostasis to act on timescales similar to those observed in experiments. Our results generalise to spiking networks with a biophysically more detailed implementation of the presynaptic inhibition mechanism. In conclusion, presynaptic inhibition provides a powerful compensatory mechanism that rapidly reduces effective recurrent interactions and thereby stabilises Hebbian learning.
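The abstract describes presynaptic inhibition as a multiplicative reduction of recurrent synaptic weights driven by inhibitory activity. The following minimal rate-model sketch illustrates that idea in a two-population excitatory-inhibitory network; the functional form p(r_I) = 1/(1 + b·r_I), all parameter values, and the function names are illustrative assumptions, not the model used in the paper.

```python
def simulate(w_ee, b=1.0, T=2.0, dt=1e-3):
    """Steady-state rates of a two-population (E, I) rate model in which
    presynaptic inhibition multiplicatively scales the recurrent excitatory
    weight.  The form p(r_I) = 1/(1 + b * r_I) and all parameters are
    illustrative assumptions, not the published model."""
    tau_e, tau_i = 0.02, 0.01            # population time constants (s)
    w_ei, w_ie, w_ii = 1.0, 1.0, 0.5     # remaining recurrent weights
    ext_e, ext_i = 1.0, 0.5              # constant external drive
    f = lambda x: max(x, 0.0)            # rectified-linear transfer function
    r_e, r_i = 0.0, 0.0
    for _ in range(int(T / dt)):
        p = 1.0 / (1.0 + b * r_i)        # presynaptic inhibition factor
        dr_e = (-r_e + f(p * w_ee * r_e - w_ei * r_i + ext_e)) / tau_e
        dr_i = (-r_i + f(w_ie * r_e - w_ii * r_i + ext_i)) / tau_i
        r_e, r_i = r_e + dt * dr_e, r_i + dt * dr_i
        if r_e > 1e6:                     # runaway activity
            return float("inf"), float("inf")
    return r_e, r_i

# Growing w_EE mimics the Hebbian strengthening of recurrent excitation.
for w_ee in (0.5, 1.0, 1.5, 2.0):
    r_plain, _ = simulate(w_ee, b=0.0)   # classical E-I network
    r_psi, _ = simulate(w_ee, b=1.0)     # with presynaptic inhibition
    print(f"w_EE={w_ee:.1f}  without PSI: rE={r_plain:.2f}  with PSI: rE={r_psi:.2f}")
```

With these assumed parameters, the classical network (b = 0) shows steeply growing and eventually runaway excitatory rates as w_EE increases, whereas the network with presynaptic inhibition settles at gradually increasing rates, mirroring the qualitative behaviour described in the abstract.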

Highlights

  • Synaptic plasticity is widely believed to be the neuronal substrate for learning and memory

  • The positive feedback loop that emerges from Hebbian plasticity is believed to be counteracted by homeostatic mechanisms that aim to keep neural activity at a given set point

  • Using mathematical analyses and computer simulations, we show that presynaptic inhibition can compensate for the strengthening of recurrent connections and stabilise neural networks subject to synaptic plasticity, even if homeostasis acts on biologically plausible timescales

Introduction

Synaptic plasticity is widely believed to be the neuronal substrate for learning and memory. Hebbian forms of plasticity strengthen synapses between neurons with correlated activity, which in turn makes their activity even more correlated, creating a positive feedback loop. The standard argument why this vicious circle does not generate runaway activity in the brain is that a broad spectrum of homeostatic mechanisms keeps the instability at bay [2, 3, 4]. Such mechanisms have been demonstrated in various forms, including homeostatic scaling [5], intrinsic plasticity [6, 7], metaplasticity [8, 9] and plasticity of inhibition [10, 11]. While these mechanisms counteract modifications that take neuronal activity out of a functional regime, they all act on relatively long timescales of hours or days. Zenke et al. [13] recently suggested that the known homeostatic mechanisms must therefore be complemented by rapid compensatory processes that render the circuit stable on shorter timescales and give homeostasis the time it needs.
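The timescale argument can be made concrete with a minimal linear sketch (our simplified illustration, not the analysis of [13]): a single recurrently connected population with rate r, effective recurrent weight w, external drive I_ext, time constant \tau and Hebbian learning rate \eta,

    \tau \frac{dr}{dt} = -r + w\,r + I_{\mathrm{ext}}, \qquad \frac{dw}{dt} = \eta\, r^{2}.

For fixed w < 1 the rate settles at r^{*} = I_{\mathrm{ext}}/(1 - w), which diverges as w approaches 1. Hebbian growth eventually pushes w across this point, after which activity grows exponentially at rate (w - 1)/\tau, i.e. on the millisecond-to-second timescale of \tau. A homeostatic controller acting on a timescale of hours cannot track such an excursion, whereas a presynaptic mechanism that rapidly rescales the effective weight, e.g. to p(r_{I})\,w, can.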
