Abstract

The information transmission among the elements of a small network of spiking neurons is studied by means of two information measures, the normalized differential entropy and the Kullback-Leibler distance. Attention is devoted to the information content of the spiking activity of a reference neuron subject to excitatory and inhibitory stimuli coming from other elements of the considered network. The use of information measures makes it possible to quantify the effects of the input contributions. The role of inhibition in the spiking activity of the reference neuron is highlighted, and the effect of considering different distributions for the time events of the stimuli is discussed.

1 Introduction

The classical characterization of the input-output properties of neurons, as well as of neuronal models, makes use of the so-called frequency transfer functions. Several definitions exist for these functions, but they all share the feature of being plots of the output firing frequency against the strength of the input signal; the differences are confined to the measure employed to quantify the input strength. Use of the transfer function tacitly assumes that the information is coded by the frequency of action potentials (2). However, frequency is only one possible descriptor of the properties of a spike train: different spike records may have the same frequency and yet be very dissimilar. Therefore, other tools to quantify the differences among spike records have been investigated. Given the well-established relevance of noise in neuronal coding (cf. for example (3, 15, 21, 19)), alternative measures able to capture the random properties of the firing activity have been introduced.

In this direction, many efforts have been made to apply Shannon's theory of communication and information transmission (20) to the study of the properties of the nervous system. Among the wide literature on this topic, we cite here for example (11), where the Fisher information is used to study the input-output effect for some neuronal models. In (12) and (13) this measure is applied to the stochastic leaky integrate-and-fire (LIF) model, identifying the input signal with a constant additive term in the drift of the Ornstein-Uhlenbeck process used to describe the underthreshold membrane potential dynamics. In (9) a normalized version of the differential entropy is introduced to study the randomness of the same neuronal model.

Here, information theory is applied to the study of the information transmission among the units of a small network of spiking neurons. In particular, we are interested in quantifying the response of a reference neuron to excitatory and inhibitory inputs coming from other units of the network. The problem has already been considered in (22) and (23), but the study is performed there with different analysis tools, such as histograms, autocorrelation functions and cross-correlation functions.
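To make the setting concrete, the following minimal sketch simulates the stochastic LIF model mentioned above, with the underthreshold membrane potential following an Ornstein-Uhlenbeck process and the input signal entering as a constant additive term mu in the drift, and tabulates the resulting frequency transfer function (mean firing rate against mu). All parameter values (time constant, noise intensity, threshold, time step) are illustrative assumptions, not those of the cited papers.

```python
import numpy as np

def ou_lif_isi(mu, sigma=0.5, tau=1.0, theta=1.0, v0=0.0, dt=1e-3, rng=None):
    """One first-passage time (interspike interval) of an OU
    leaky integrate-and-fire neuron, Euler-Maruyama scheme:
    dV = (-V/tau + mu) dt + sigma dW,  spike when V >= theta.
    All parameter values here are illustrative assumptions."""
    rng = rng or np.random.default_rng()
    v, t, sqdt = v0, 0.0, np.sqrt(dt)
    while v < theta:
        v += (-v / tau + mu) * dt + sigma * sqdt * rng.standard_normal()
        t += dt
    return t

# Frequency transfer function: output firing rate vs. input strength mu.
rng = np.random.default_rng(0)
for mu in (0.8, 1.0, 1.2, 1.5):
    isis = [ou_lif_isi(mu, rng=rng) for _ in range(200)]
    print(f"mu = {mu:4.1f}   rate = {1.0 / np.mean(isis):6.3f} spikes/unit time")
```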
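Likewise, the two information measures used throughout the paper can be estimated from interspike-interval samples. The sketch below uses plain histogram plug-in estimators of the differential entropy and of the Kullback-Leibler distance; normalizing the entropy by that of an exponential interval with the same mean is only one plausible choice, not necessarily the definition adopted in (9), and the gamma-distributed samples stand in for real spike data purely for illustration.

```python
import numpy as np

def hist_entropy_kl(x, y, bins=50):
    """Histogram plug-in estimates of the differential entropy of the
    ISI sample x and of the Kullback-Leibler distance D(x || y)
    between the ISI samples x and y.  A crude estimator for
    illustration only; the paper's estimator may differ."""
    lo, hi = 0.0, max(x.max(), y.max())
    px, edges = np.histogram(x, bins=bins, range=(lo, hi), density=True)
    py, _ = np.histogram(y, bins=bins, range=(lo, hi), density=True)
    w = edges[1] - edges[0]                    # common bin width
    nz = px > 0
    h = -np.sum(px[nz] * np.log(px[nz]) * w)   # differential entropy of x
    both = nz & (py > 0)                       # bins with px>0, py=0 would give
    kl = np.sum(px[both] * np.log(px[both] / py[both]) * w)  # infinite KL; skipped
    return h, kl

# Illustrative ISI samples for a reference neuron with and without an
# inhibitory input (hypothetical gamma data, not results from the paper).
rng = np.random.default_rng(1)
x = rng.gamma(shape=2.0, scale=0.5, size=5000)
y = rng.gamma(shape=3.0, scale=0.5, size=5000)
h, kl = hist_entropy_kl(x, y)
h_exp = 1.0 + np.log(x.mean())   # entropy of an exponential ISI with the same
                                 # mean; one possible normalizing reference
print(f"entropy = {h:.3f} nats, normalized = {h / h_exp:.3f}, KL = {kl:.3f} nats")
```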
