Abstract

Neuronal information processing is energetically costly. Restrictions on energy supply have driven brains to evolve to compute and communicate information with remarkable efficiency; indeed, energy minimization subject to functional constraints is a unifying principle. Toward a better understanding of neuronal information processing and communication from an information-energy standpoint, we consider a neuron model with a generalized inverse Gaussian (GIG) conditional density. This GIG model arises from a Lévy diffusion process that is more general than a Wiener process with drift. We show that, when the GIG neuron operates so as to maximize bits per Joule (bpJ), the output interspike interval (ISI) distribution is a related GIG marginal distribution, and we specify how to obtain the associated input distribution f_Λ(λ) numerically. By generalizing from the Gamma and inverse Gaussian (IG) families to the GIG family, the derived results contain both the homogeneous Poisson and Wiener processes as special cases. The results allow us to readily compute the tradeoff between information rate (bits/second) and average power (Joules/second) in the GIG class, reminiscent of Shannon's celebrated rate-power formula for the additive Gaussian family.
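
As a rough numerical illustration (not the authors' code), the Python sketch below evaluates a GIG density via scipy.special.kv and estimates one point on the rate (bits/s) versus power (J/s) tradeoff on a discretized grid. The mapping of the input intensity λ to the GIG parameter a, the fixed energy-per-spike cost model, and all parameter values are illustrative assumptions; the paper's exact parameterization, cost function, and bpJ-optimal input distribution are not reproduced here.

```python
import numpy as np
from scipy.special import kv  # modified Bessel function of the second kind

def gig_pdf(x, a, b, p):
    """Generalized inverse Gaussian density for x > 0:
    f(x; a, b, p) = (a/b)^{p/2} / (2 K_p(sqrt(a b))) * x^{p-1} * exp(-(a x + b/x)/2)."""
    norm = (a / b) ** (p / 2) / (2.0 * kv(p, np.sqrt(a * b)))
    return norm * x ** (p - 1) * np.exp(-(a * x + b / x) / 2.0)

# Discretized ISI axis and candidate input intensities (hypothetical values).
t = np.linspace(1e-3, 5.0, 2000)    # ISI grid (seconds)
dt = t[1] - t[0]
lam = np.linspace(0.5, 4.0, 40)     # candidate intensities λ

# Conditional densities f(t | λ): λ is mapped to the GIG parameter a here,
# an illustrative choice, not the paper's parameterization. p = -1/2 gives
# the inverse Gaussian special case (Wiener process with drift).
F = np.array([gig_pdf(t, a=l, b=1.0, p=-0.5) for l in lam])  # shape (|λ|, |t|)
F /= F.sum(axis=1, keepdims=True) * dt                       # renormalize on the grid

def rate_and_power(w, energy_per_spike=1.0):
    """Approximate bits/s and J/s for input weights w over λ,
    assuming a fixed energy cost per spike (an assumption of this sketch)."""
    pt = w @ F  # marginal output density on the grid
    # Mutual information I(Λ; T) in bits per ISI: sum_λ w(λ) KL(f(.|λ) || p).
    with np.errstate(divide="ignore", invalid="ignore"):
        kl = np.where(F > 0, F * np.log2(F / pt), 0.0)
    bits_per_interval = np.sum(w * (kl.sum(axis=1) * dt))
    mean_isi = np.sum(w * (F @ (t * dt)))  # average interspike interval (s)
    # Rate approximated as information per interval divided by mean interval.
    return bits_per_interval / mean_isi, energy_per_spike / mean_isi

w = np.full(lam.size, 1.0 / lam.size)  # uniform input as a starting point
rate, power = rate_and_power(w)
print(f"rate = {rate:.3f} bits/s, power = {power:.3f} J/s, bpJ = {rate / power:.3f}")
```

Sweeping the input weights w (e.g., by a Blahut-Arimoto-style iteration under an average-power constraint) would trace out the full rate-power curve; the uniform input above is only a starting point for such a search.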
