We study a learning rule based upon the temporal correlation (weighted by a learning kernel) between incoming spikes and the internal state of the postsynaptic neuron, building upon previous studies of spike-timing-dependent synaptic plasticity (Kempter, R., Gerstner, W., van Hemmen, J.L., Wagner, H., 1998. Extracting oscillations: Neuronal coincidence detection with noisy periodic spike input. Neural Computation 10, 1987–2017; Kempter, R., Gerstner, W., van Hemmen, J.L., 1999. Hebbian learning and spiking neurons. Physical Review E 59, 4498–4514; van Hemmen, J.L., 2001. Theory of synaptic plasticity. In: Moss, F., Gielen, S. (Eds.), Handbook of Biological Physics, vol. 4: Neuro-Informatics and Neural Modelling. Elsevier, Amsterdam, pp. 771–823). Our learning rule for the synaptic weight $w_{ij}$ is

$$\dot{w}_{ij}(t) = \epsilon \int_{-\infty}^{\infty} \frac{1}{T_l} \int_{t-T_l}^{t} \sum_{\mu} \delta(\tau + s - t_{j,\mu})\, u(\tau)\, d\tau \; \Gamma(s)\, ds,$$

where the $t_{j,\mu}$ are the arrival times of spikes from the presynaptic neuron $j$ and the function $u(t)$ describes the state of the postsynaptic neuron $i$. Thus, the spike-triggered average contained in the inner integral is weighted by a kernel $\Gamma(s)$, the learning window, which is positive for negative and negative for positive values of the time difference $s$ between post- and presynaptic activity. An antisymmetry assumption for the learning window enables us to derive analytical expressions for a general class of neuron models and to study the changes in input–output relationships that follow from synaptic weight changes. This is a genuinely non-linear effect (Song, S., Miller, K., Abbott, L., 2000. Competitive Hebbian learning through spike-timing-dependent synaptic plasticity. Nature Neuroscience 3, 919–926).
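The following is a minimal numerical sketch of the learning rule above, not the authors' implementation. It assumes a discretized time grid, a particular antisymmetric learning window $\Gamma(s) = A\,\mathrm{sign}(-s)\,e^{-|s|/\tau_\Gamma}$ (a hypothetical choice consistent with the antisymmetry assumption), and illustrative parameter values; the names `weight_derivative`, `eps`, `T_l`, and `tau_gamma` are ours, not the paper's.

```python
# Sketch: approximate dw_ij/dt for one presynaptic neuron j at time t,
# given its spike arrival times and the postsynaptic state trace u(t).
# Window shape, parameters, and discretization are illustrative assumptions.
import numpy as np

def weight_derivative(t, spike_times_j, u, dt=1e-3,
                      eps=1e-3, T_l=0.1, tau_gamma=0.02, A=1.0):
    # Antisymmetric learning window: positive for s < 0, negative for s > 0
    # (assumed exponential shape).
    def gamma(s):
        return A * np.sign(-s) * np.exp(-np.abs(s) / tau_gamma)

    # Outer integral over s, truncated to a few window time constants.
    s_grid = np.arange(-5 * tau_gamma, 5 * tau_gamma, dt)
    dwdt = 0.0
    for s in s_grid:
        # Inner integral: the delta functions pick out tau = t_{j,mu} - s
        # inside the averaging interval (t - T_l, t], giving a
        # spike-triggered average of u normalized by T_l.
        taus = spike_times_j - s
        mask = (taus > t - T_l) & (taus <= t)
        inner = np.sum(u(taus[mask])) / T_l
        dwdt += inner * gamma(s) * dt
    return eps * dwdt

# Example usage: noisy spike times and a sinusoidal postsynaptic state.
rng = np.random.default_rng(0)
spikes = np.sort(rng.uniform(0.0, 1.0, size=50))                 # t_{j,mu} in seconds
u_post = lambda t: 1.0 + np.sin(2 * np.pi * 10 * np.asarray(t))  # u(t)
print(weight_derivative(t=1.0, spike_times_j=spikes, u=u_post))
```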