Abstract

Action potentials allow nervous systems to transmit information rapidly and efficiently over considerable distances, but what information do they carry, and how much can be carried by a single neuron? Descriptions are often qualitative and vague, such as the firing rate representing stimulus intensity. Recent attempts to quantify information transmission by action potentials have treated neurons as communication channels whose information capacity can be estimated from their signal-to-noise ratios. However, this indicates only how much information could theoretically be carried, not the amount actually carried at any given time, and the ratio itself depends on assumptions about how information is coded. Here we introduce a different approach based on data compression, a concept that has become familiar through the widespread use of digital computers and networks. Compression exploits redundancy in a sequence of numbers to reduce its size while still allowing the sequence to be reconstructed later without error. We show that data compression by a context-free grammar can quantitatively estimate the actual information content of action potential signals without any prior assumptions about coding, or knowledge of the neuron's inputs. Measurements of information coding by mechanosensory neurons are used as examples, but a major advantage of this approach is its generality: it can estimate information transmission by any neuron whose output can be measured, regardless of neuronal type, connectivity or function.
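To make the idea concrete, the sketch below (not taken from the paper) binarises a hypothetical spike train and compresses it with Re-Pair, a simple context-free-grammar compressor: the most frequent pair of adjacent symbols is repeatedly replaced by a new nonterminal until no pair repeats. The size of the resulting grammar, in bits, gives a rough upper bound on the information the spike train carries. The spike times, bin width, and crude bit-counting are illustrative assumptions, and the grammar algorithm and coding scheme used by the authors may differ.

```python
# Illustrative sketch only: grammar-based compression of a binarised spike train.
# The Re-Pair compressor below stands in for the context-free-grammar method
# described in the abstract; all numbers are made up for demonstration.
from collections import Counter
import math

def binarise(spike_times, duration, bin_width):
    """Convert spike times (s) into a 0/1 sequence: 1 = at least one spike in the bin."""
    n_bins = int(duration / bin_width)
    seq = [0] * n_bins
    for t in spike_times:
        i = int(t / bin_width)
        if 0 <= i < n_bins:
            seq[i] = 1
    return seq

def repair_grammar(seq):
    """Re-Pair: repeatedly replace the most frequent adjacent pair of symbols with a
    new nonterminal until every pair is unique. Returns the compressed sequence and
    the grammar rules (nonterminal -> pair of symbols)."""
    seq = list(seq)
    rules = {}
    next_symbol = 2  # 0 and 1 are terminals; nonterminals start at 2
    while True:
        pairs = Counter(zip(seq, seq[1:]))
        if not pairs:
            break
        pair, count = pairs.most_common(1)[0]
        if count < 2:
            break
        rules[next_symbol] = pair
        new_seq, i = [], 0
        while i < len(seq):
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                new_seq.append(next_symbol)
                i += 2
            else:
                new_seq.append(seq[i])
                i += 1
        seq = new_seq
        next_symbol += 1
    return seq, rules

def compressed_size_bits(seq, rules):
    """Crude size estimate: every symbol in the compressed sequence and in the rule
    right-hand sides costs log2(alphabet size) bits. A real coder would be tighter."""
    alphabet = 2 + len(rules)
    n_symbols = len(seq) + 2 * len(rules)
    return n_symbols * math.log2(alphabet)

# Hypothetical spike train: 1 s of recording, 5 ms bins.
spikes = [0.012, 0.034, 0.051, 0.213, 0.215, 0.412, 0.413, 0.610, 0.612, 0.811]
seq = binarise(spikes, duration=1.0, bin_width=0.005)
compressed, rules = repair_grammar(seq)
bits = compressed_size_bits(compressed, rules)
print(f"Raw size: {len(seq)} bits; grammar-compressed estimate: {bits:.1f} bits")
print(f"Estimated information rate (1 s record): {bits:.1f} bits/s")
```

Because the grammar can be expanded back into the original binary sequence without error, its size bounds the information in the spike train from above; the more regular the spike train, the more repeated pairs the grammar absorbs and the smaller the estimate becomes.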
