Abstract

The Information Theory (IT) of Fisher and Shannon provides convenient tools for the systematic and unbiased extraction of the chemical interpretation of the known (experimental or calculated) electron distribution in a molecule. A short overview of the basic concepts, relations, and techniques of IT is presented. The Shannon (S) entropy, reflecting the amount of uncertainty (spread, disorder) contained in a given probability distribution, and the complementary Fisher (F) (intrinsic-accuracy) measure, focusing on the distribution narrowness (order), are introduced. The relative ("cross") entropy (entropy deficiency, missing information, directed divergence) concept of Kullback and Leibler (KL), probing the information distance between the compared probability distributions, is presented. Rudiments of the IT descriptors of communication channels are outlined and applied to the illustrative symmetric binary channel (SBC). The average conditional-entropy (communication noise) and mutual-information (information flow) quantities of information networks are then discussed in more detail in view of their importance for interpreting the covalent and ionic bond components within the "communication" theory of the chemical bond. The information characteristics of several dependent probability schemes are then briefly summarized, and the variational principle for the constrained extremum of the adopted measure of information, called the extreme physical information (EPI) principle, is advocated as a powerful tool for an unbiased assimilation, in the optimum probability distribution, of the information contained in the relevant constraints and/or references.
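For orientation, the sketch below illustrates numerically the channel descriptors named in the abstract: the Shannon entropy, the Kullback-Leibler entropy deficiency, and the decomposition of the output entropy of a symmetric binary channel into the average conditional entropy (noise) and the mutual information (flow). The function names and the crossover probability are illustrative assumptions, not quantities taken from the paper.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon entropy S(p) = -sum_i p_i log2(p_i), in bits."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0.0]                       # 0*log(0) = 0 by convention
    return -np.sum(p * np.log2(p))

def kl_divergence(p, p0):
    """Kullback-Leibler entropy deficiency DS(p|p0) = sum_i p_i log2(p_i/p0_i)."""
    p, p0 = np.asarray(p, dtype=float), np.asarray(p0, dtype=float)
    mask = p > 0.0
    return np.sum(p[mask] * np.log2(p[mask] / p0[mask]))

def channel_descriptors(p_in, P_cond):
    """Average conditional entropy S(B|A) ("noise") and mutual information
    I(A:B) ("flow") of a channel with input distribution p_in and
    row-stochastic conditional matrix P_cond[a, b] = P(b|a)."""
    p_in = np.asarray(p_in, dtype=float)
    P_cond = np.asarray(P_cond, dtype=float)
    p_joint = p_in[:, None] * P_cond                 # joint P(a, b)
    p_out = p_joint.sum(axis=0)                      # output distribution P(b)
    noise = shannon_entropy(p_joint.ravel()) - shannon_entropy(p_in)  # S(B|A) = S(A,B) - S(A)
    flow = shannon_entropy(p_out) - noise                             # I(A:B) = S(B) - S(B|A)
    return noise, flow

# Symmetric binary channel with an assumed crossover (error) probability w:
w = 0.1
P = np.array([[1 - w, w],
              [w, 1 - w]])
noise, flow = channel_descriptors([0.5, 0.5], P)
print(noise, flow, noise + flow)   # for uniform input: S(B|A) + I(A:B) = S(B) = 1 bit
```

For the uniform input assumed here, the noise term equals the binary entropy of the crossover probability and the flow term is its complement to one bit, mirroring the covalent/ionic partition of the bond descriptors discussed in the communication theory of the chemical bond.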
