Abstract

A recent model of intrinsic plasticity coupled to Hebbian synaptic plasticity proposes that adaptation of a neuron's threshold and gain in a sigmoidal response function to achieve a sparse, exponential output firing rate distribution facilitates the discovery of heavy-tailed or supergaussian sources in the neuron's inputs. We show that the exponential output distribution is irrelevant to these dynamics and that, furthermore, while sparseness is sufficient, it is not necessary. The intrinsic plasticity mechanism drives the neuron's threshold to large, positive values, and we prove that in such a regime, the neuron will find supergaussian sources; equally, however, if the threshold is large and negative (an antisparse regime), it will also find supergaussian sources. Away from such extremes, the neuron can also discover subgaussian sources. By examining a neuron with a fixed sigmoidal nonlinearity and considering the synaptic-strength fixed-point structure in the two-dimensional parameter space defined by the neuron's threshold and gain, we show that this space is carved up into sub- and supergaussian-input-finding regimes, possibly with regimes of simultaneous stability of sub- and supergaussian sources or regimes of instability of all sources; a single gaussian source may also be stabilized by the presence of a nongaussian source. A neuron's operating point (essentially its threshold and gain, coupled with its input statistics) therefore critically determines its computational repertoire. Intrinsic plasticity mechanisms induce trajectories in this parameter space but do not fundamentally modify it. Unless these trajectories cross critical boundaries in the space, intrinsic plasticity is irrelevant, and the neuron's nonlinearity may be frozen without altering the receptive field refinement dynamics.
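
To make the class of model concrete, the following is a minimal sketch, not the paper's own code, of a single sigmoidal neuron whose gain and bias adapt under the intrinsic plasticity rule commonly associated with this line of work (Triesch's gradient rule targeting an exponential output distribution), coupled to a normalized Hebbian rule, with inputs formed as a linear mixture of one supergaussian (Laplacian) and one gaussian source. The mixing matrix, learning rates, and target mean rate are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_steps = 200_000
eta_ip = 0.01      # intrinsic plasticity learning rate (assumed)
eta_hebb = 0.001   # Hebbian learning rate (assumed)
mu = 0.1           # target mean of the exponential output distribution

# Two zero-mean, unit-variance sources, linearly mixed onto two inputs;
# the mixing matrix is an illustrative assumption.
A = np.array([[0.9, 0.4],
              [0.4, 0.9]])

w = rng.normal(size=2)
w /= np.linalg.norm(w)
a, b = 1.0, 0.0    # gain and bias of y = 1 / (1 + exp(-(a*u + b)))

def sigmoid(z):
    # Clip to avoid overflow in exp when the operating point drifts far.
    return 1.0 / (1.0 + np.exp(-np.clip(z, -50.0, 50.0)))

for _ in range(n_steps):
    s = np.array([rng.laplace(scale=1 / np.sqrt(2)),  # supergaussian source
                  rng.normal()])                      # gaussian source
    x = A @ s
    u = w @ x                  # synaptic drive
    y = sigmoid(a * u + b)     # sigmoidal output

    # Intrinsic plasticity: gradient rule pulling the output distribution
    # toward an exponential with mean mu (Triesch's rule).
    common = 1.0 - (2.0 + 1.0 / mu) * y + (y ** 2) / mu
    a += eta_ip * (1.0 / a + u * common)
    b += eta_ip * common

    # Hebbian synaptic plasticity with multiplicative normalization,
    # so only the direction of w (the discovered source) matters.
    w += eta_hebb * y * x
    w /= np.linalg.norm(w)

# In the sparse regime the abstract describes, w should align with the
# direction of the supergaussian source; the operating point is read off
# from the learned gain and bias (threshold roughly -b/a).
print("learned weight direction:", w)
print("gain a = %.3f, bias b = %.3f" % (a, b))
```

In terms of the abstract's analysis, freezing a and b at any point of this loop leaves the Hebbian dynamics of w unchanged so long as the fixed operating point stays within the same regime of the (threshold, gain) parameter space.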
