Abstract
Mutual information (MI) is widely used to analyze the neural code in a variety of stochastic neuronal sensory systems. Unfortunately, MI is analytically tractable only for simple coding problems. One way to address this difficulty is to relate MI to Fisher information, which is easier to compute and to interpret in terms of neurophysiological parameters. The relationship between the two measures is not always clear, however, and often depends on the probability distribution of the noise. Using Stam's inequality, we show here that deviations from Gaussianity in the neuronal response distribution can lead to a large overestimation of MI, even in the small-noise regime. This result is especially relevant when studying neural codes represented by Poissonian neurons.
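As an illustrative sketch (not from the paper itself), the overestimation effect can be seen in its simplest form by comparing the exact Shannon entropy of a Poisson response with the entropy of a Gaussian of the same variance, which is the surrogate implicit in Fisher-information-based approximations. The function names and the choice of rate values below are hypothetical, for illustration only.

```python
import math

def poisson_entropy_bits(lam, kmax=200):
    """Exact Shannon entropy (bits) of Poisson(lam) by direct summation."""
    H = 0.0
    logp = -lam  # natural log of P(K = 0) = exp(-lam)
    for k in range(kmax):
        p = math.exp(logp)
        if p > 0.0:
            H -= p * math.log2(p)
        # recurrence: log P(k+1) = log P(k) + log(lam) - log(k+1)
        logp += math.log(lam) - math.log(k + 1)
    return H

def gaussian_entropy_bits(var):
    """Entropy (bits) of a Gaussian with the same variance,
    i.e. the Gaussian/Fisher surrogate for the response entropy."""
    return 0.5 * math.log2(2 * math.pi * math.e * var)

for lam in (1.0, 5.0, 20.0):
    He = poisson_entropy_bits(lam)
    Hg = gaussian_entropy_bits(lam)  # Var[Poisson(lam)] = lam
    print(f"lambda={lam:5.1f}  exact={He:.4f}  gaussian={Hg:.4f}"
          f"  overestimate={Hg - He:.4f}")
```

The Gaussian surrogate always exceeds the exact Poisson entropy, and the gap is largest at low rates, consistent with the abstract's point that non-Gaussianity matters most where responses are strongly Poissonian (few spikes).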
Physica A: Statistical Mechanics and its Applications