Abstract

In a remarkable series of papers beginning in 1957, E. T. Jaynes began a revolution in inductive thinking with his principle of maximum entropy. He defined probability as a degree of plausibility, a much more general and useful definition than the frequentist one as the limit of a ratio of frequencies in some imaginary experiment. He then used Shannon's definition of entropy and stated that, in any situation in which we have incomplete information, the probability assignment that expresses all known information while remaining maximally non-committal with respect to all unknown information is the unique probability distribution with maximum entropy (ME). It is also a combinatorial theorem that the unique ME probability distribution is the one that can be realized in the greatest number of ways. The ME principle thus provides the fairest description of our state of knowledge. When further information is obtained, and that information is pertinent, a new ME calculation can be performed, with a consequent reduction in entropy and an increase in our total information. It must be emphasized that the ME solution is not necessarily the "correct" solution; it is simply the best that can be done with whatever data are available. There is no one "correct" solution, but an infinity of possible solutions. These ideas will now be made quite concrete and expressed mathematically.
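The constrained maximization the abstract describes can be made concrete with Jaynes' well-known dice example: given only that a die's long-run mean is, say, 4.5 rather than the fair value 3.5, the ME distribution over the faces has the exponential form p_i ∝ e^(−λx_i), with λ chosen so the mean constraint holds. The sketch below (function names are illustrative, not from the paper) finds λ by bisection using only the standard library:

```python
import math

def max_entropy_dist(outcomes, target_mean, lo=-50.0, hi=50.0, iters=200):
    """Maximum-entropy distribution over `outcomes` subject to a mean constraint.

    The ME solution has the form p_i proportional to exp(-lam * x_i);
    we bisect on lam, since the resulting mean is decreasing in lam.
    """
    def mean_for(lam):
        w = [math.exp(-lam * x) for x in outcomes]
        z = sum(w)  # partition function
        return sum(x * wi for x, wi in zip(outcomes, w)) / z

    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mean_for(mid) > target_mean:
            lo = mid  # mean too high -> need larger lam
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(-lam * x) for x in outcomes]
    z = sum(w)
    return [wi / z for wi in w]

# Jaynes' dice problem: faces 1..6, observed mean 4.5
p = max_entropy_dist([1, 2, 3, 4, 5, 6], 4.5)
entropy = -sum(pi * math.log(pi) for pi in p)
```

With no constraint beyond normalization, λ = 0 and the result is the uniform distribution (entropy log 6); the mean constraint tilts the distribution toward higher faces and, as the abstract notes, necessarily lowers the entropy below log 6.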
