Abstract

The mutual information between a stimulus signal and the spike count of a stochastic neuron is in many cases difficult to determine. Therefore, it is often approximated by a lower bound formula that involves only the linear correlations between input and output. Here, we improve the linear lower bound on the mutual information by incorporating nonlinear correlations. For the special case of a Gaussian output variable whose mean and variance depend nonlinearly on the signal, we also derive an exact integral formula for the full mutual information. In our numerical analysis, we first compare the linear and nonlinear lower bounds and the exact integral formula for two different Gaussian models and show under which conditions the nonlinear lower bound provides a significant improvement over the linear approximation. We then inspect two neuron models: the leaky integrate-and-fire model with white Gaussian noise and the Na-K model with channel noise. We show that for certain firing regimes and intermediate signal strengths the nonlinear lower bound can yield a substantial improvement over the linear lower bound. Our results demonstrate the importance of nonlinear input-output correlations for neural information transmission and provide a simple nonlinear approximation for the mutual information that can be applied to more complicated neuron models as well as to experimental data.
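
To make the two quantities concrete, the sketch below computes the standard form of the linear lower bound, I(s;n) >= -1/2 log2(1 - rho^2) with rho the Pearson correlation (valid for a Gaussian stimulus), and the exact mutual information of a conditionally Gaussian output, I(s;n) = h(n) - <h(n|s)>, with h(n|s) = 1/2 log2(2 pi e sigma(s)^2). The model functions mu and sigma are illustrative choices for demonstration only, not the Gaussian models or neuron models analyzed in the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    # Toy conditionally Gaussian model: Gaussian stimulus s ~ N(0, 1) and
    # output n | s ~ N(mu(s), sigma(s)^2), with nonlinear signal dependence
    # of both mean and variance. These are hypothetical example functions.
    def mu(s):
        return 10.0 + 3.0 * np.tanh(s)

    def sigma(s):
        return 1.0 + 0.5 * s**2

    # Linear lower bound from samples: for a Gaussian stimulus,
    # I(s; n) >= -1/2 * log2(1 - rho^2), rho = Pearson correlation.
    s = rng.standard_normal(200_000)
    n = mu(s) + sigma(s) * rng.standard_normal(s.size)
    rho = np.corrcoef(s, n)[0, 1]
    I_lin = -0.5 * np.log2(1.0 - rho**2)

    # Exact mutual information for a conditionally Gaussian output:
    # I(s; n) = h(n) - <h(n|s)>_s, h(n|s) = 1/2 log2(2 pi e sigma(s)^2).
    ds, dn = 0.01, 0.05
    s_grid = np.arange(-6.0, 6.0 + ds, ds)
    n_grid = np.arange(-60.0, 80.0 + dn, dn)
    p_s = np.exp(-0.5 * s_grid**2) / np.sqrt(2.0 * np.pi)

    # Conditional density p(n | s) on the grid, then the marginal
    # p(n) = integral of p(s) p(n | s) ds (rectangle rule).
    S, N = np.meshgrid(s_grid, n_grid, indexing="ij")
    p_n_given_s = (np.exp(-0.5 * ((N - mu(S)) / sigma(S))**2)
                   / (np.sqrt(2.0 * np.pi) * sigma(S)))
    p_n = np.sum(p_s[:, None] * p_n_given_s, axis=0) * ds

    # Marginal entropy h(n) and average conditional entropy <h(n|s)> in bits.
    h_n = -np.sum(p_n * np.log2(p_n + 1e-300)) * dn
    h_n_s = np.sum(p_s * 0.5 * np.log2(2.0 * np.pi * np.e * sigma(s_grid)**2)) * ds
    I_exact = h_n - h_n_s

    print(f"linear lower bound: {I_lin:.3f} bits  exact MI: {I_exact:.3f} bits")

Because the variance in this toy model depends strongly on the signal, the linear bound comes out well below the exact value, which is the kind of gap a nonlinear lower bound is meant to close.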
