Abstract

Predictive Statistical Mechanics (PSM) is a term coined by Edwin T. Jaynes (1922–1998), an international leader in the foundations of theoretical physics. The book is very timely for researchers in hydraulic engineering, and possibly for researchers in other areas of civil engineering, who may not be familiar with the approach used by PSM even though it was established over half a century ago and is reported to have had successful applications in many scientific fields, including information theory, physics, astronomy, and even economics and regional and urban planning. For potential readers in hydraulic engineering, the most informative part of the book, and also the easiest to understand, is the material presented in the Preface, the Prolegomena, the Introduction, and the Final Remarks (Chapter 8). These four sections give an in-depth description of the philosophical and mathematical foundations of PSM.

To bring up some of the key points: PSM is based on the Bayesian approach to probability theory, which Philip W. Anderson, winner of the 1977 Nobel Prize in Physics, and other scientists consider "the most appropriate probability theory in the sciences." The key question that arises in applying the approach is: "How shall we use probability theory to help us do plausible reasoning in situations where, because of incomplete information, we cannot use deductive reasoning?" The main task in using the approach is therefore to derive the probability distribution from the incomplete information available without making unwarranted hypotheses or assumptions. Jaynes proposed to derive the most probable, or least biased, probability distribution by applying the variational principle of maximizing Shannon's information entropy (Shannon 1948) subject to the constraints imposed by the available information, as sketched below.

To give a little background about "entropy," Jaynes (1982) said: "By far the most abused word in science is 'entropy.' Confusion over the different meanings of this word, already serious 35 years ago, reached disaster proportions with the 1948 advent of Shannon's information theory, which not only appropriated the same word for a new set of meanings; but even worse, proved to be highly relevant to statistical mechanics." The term "information entropy" was coined by Claude E. Shannon (1916–2001), the father of information theory, on the advice of John von Neumann. Some, in addition to Jaynes, seemed to feel that this choice was unfortunate, because information entropy has a much broader meaning and wider potential applications than thermodynamic entropy: while thermodynamic entropy is limited to closed systems in equilibrium, the concept of information entropy is not. To avoid the confusion, it would have been better to preserve the term "entropy" …
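To make the variational principle concrete, the following is a minimal sketch (not taken from the book under review) of the standard maximum-entropy derivation for a discrete distribution p_i under a single moment constraint; the function f and the value F are illustrative placeholders for whatever information is actually available.

```latex
% Maximize Shannon's entropy  H(p) = -\sum_i p_i \ln p_i
% subject to normalization and one illustrative moment constraint:
%   \sum_i p_i = 1, \qquad \sum_i p_i\, f(x_i) = F.
% Introduce Lagrange multipliers \lambda_0, \lambda_1 and form
\mathcal{L} = -\sum_i p_i \ln p_i
              - \lambda_0 \Big( \sum_i p_i - 1 \Big)
              - \lambda_1 \Big( \sum_i p_i\, f(x_i) - F \Big).
% Setting \partial \mathcal{L} / \partial p_i = 0 gives
%   -\ln p_i - 1 - \lambda_0 - \lambda_1 f(x_i) = 0,
% i.e. the exponential form
p_i = \frac{e^{-\lambda_1 f(x_i)}}{Z(\lambda_1)},
\qquad
Z(\lambda_1) = \sum_i e^{-\lambda_1 f(x_i)},
% where \lambda_1 is fixed by requiring \sum_i p_i\, f(x_i) = F.
% With energy as the constrained quantity, this recovers the Gibbs
% (canonical) distribution of equilibrium statistical mechanics.
```

With no constraint beyond normalization, the same calculation yields the uniform distribution, which is the precise sense in which the maximum-entropy distribution is the "least biased" choice consistent with the available information.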
