Abstract

Informational entropy is quantitatively related to similarity and symmetry. Some tacit assumptions about their correlation have been shown to be wrong, and the Gibbs paradox statement (that indistinguishability corresponds to minimum entropy, which is zero) has been rejected. All of these correlations rest on the relation that less information content corresponds to more entropy: a higher entropy value correlates with higher molecular similarity. The maximum entropy of any system (e.g., a mixture or an assemblage) corresponds to indistinguishability (total loss of information), to perfect or highest symmetry, and to the highest simplicity. This conforms, without exception, to all experimental facts for both dynamic systems and static structures, and for the related information-loss processes.
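The central relation above can be illustrated with a minimal sketch (not from the paper): for Shannon's informational entropy, a skewed probability distribution over distinguishable outcomes carries more information and less entropy, while a uniform distribution over indistinguishable outcomes attains the maximum entropy, log2(n).

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits; zero-probability outcomes are skipped."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Distinguishable components (skewed distribution): lower entropy,
# higher information content.
skewed = shannon_entropy([0.7, 0.2, 0.1])

# Indistinguishable components (uniform distribution): maximum entropy,
# equal to log2(3) for three outcomes -- total loss of information.
uniform = shannon_entropy([1/3, 1/3, 1/3])

print(skewed, uniform)
```

Under these assumptions, `uniform` exceeds `skewed` and equals log2(3) ≈ 1.585 bits, matching the claim that indistinguishability corresponds to maximum entropy.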

