Abstract

This paper is concerned with a critical evaluation of the measure of information and its relationship with entropy in physics, elaborating on Georgescu-Roegen's critique. First, we introduce Shannon's measure of information, touching upon the historical development of the concept of information in communication engineering. Three points are emphasized: (1) the concept of information and the capacity of a communication channel should have been treated as separate concepts; (2) it is accidental that Shannon reached the function H = −∑ p_i log₂ p_i, where ∑ p_i = 1, through two different routes, namely an axiomatic treatment of information and the method of typical sequences; (3) Shannon wrongly identified a source of vernacular language with an ergodic Markov chain. Second, we present an analysis of Wiener's measure of information, or uncertainty, on a stochastic process. The main results are: (1) any measure of uncertainty, H among them, is not an ordinal variable but a pseudo-measure; (2) the amount of Wiener's information for every continuous distribution is infinite; (3) the expected amount of Wiener's information for any absolutely continuous distribution depends only on the ordinal measure adopted. Third, it is shown that the alleged equivalence between negative entropy and information is untenable, as a close reading of the works of Szilard, Jaynes, and Brillouin demonstrates. Finally, Georgescu-Roegen's critique of the measure of information, and of the alleged equivalence between negative entropy and information, is briefly related to his interest in epistemology.
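As a concrete illustration of the function at issue (not part of the paper itself), Shannon's H can be computed directly from a finite probability distribution. The helper below is a minimal sketch in Python; the function name and the convention 0 · log 0 = 0 are standard but chosen here for illustration:

```python
import math

def shannon_entropy(p, base=2):
    """Shannon's H = -sum(p_i * log(p_i)), with the convention 0*log(0) = 0."""
    if not math.isclose(sum(p), 1.0):
        raise ValueError("probabilities must sum to 1")
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

# A fair coin carries 1 bit of uncertainty; a certain outcome carries none.
h_coin = shannon_entropy([0.5, 0.5])   # 1.0 bit
h_sure = shannon_entropy([1.0])        # 0.0 bits
```

Note that H is maximized by the uniform distribution, e.g. four equiprobable outcomes give 2 bits; the paper's point is that such values behave as a pseudo-measure rather than as an ordinal variable.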

