Abstract

Since its inception, the concept of entropy has been known under a variety of guises and has been used in an ever-increasing number of contexts, achieving an almost rock-star-like status in both the sciences and popular culture. The three most prominent “styles” in which entropy has been (re)told, and which have determined its popularity, are the thermodynamic, the statistical and the information-theoretic one, owing much to the work of Clausius, of Boltzmann and Gibbs, and of Shannon, respectively. In the relentless hunt for the core of the concept that spurred this development, connections emerged with irreversibility and the emergence of time, and with the nature of probability and information, adding to its elusiveness as much as stimulating its proliferation and cross-contextual adoption. In this historical review, we retrace, through primary and secondary sources, the three main perspectives from which entropy has been regarded, emphasising the motivations behind each new version, their ramifications and the bridges that have been constructed to justify them. From this analysis of the foundations, a number of characteristic traits of the concept emerge that underline its exceptionality as an engine of conceptual progress.

Highlights

  • What is entropy? If the questioner’s purpose were to suddenly embarrass a physicist for a moment, that enquiry would certainly achieve its aim—not because there is no answer, but because there are too many: a simple selection from the known historical sources, limited for example to thermodynamics and statistical mechanics, already presents us with several different, seemingly unrelated and far from unambiguous definitions

  • After a long gestation (1854–1865), Clausius was able to restate the second law in an elegant and compact fashion in his volume on the mechanical theory of heat [2]: “If for the entire universe we conceive the same magnitude to be determined [..] which for a single body I have called entropy, and if at the same time we introduce the other and simpler conception of energy, we may express in the following manner the fundamental laws of the universe which correspond to the two fundamental theorems of the mechanics of heat”

  • The introduction of these probabilistic aspects—the Maxwell distribution, the demon and the statistical certainty of the second law—as properties of the physical description rather than of the physics itself constitutes the beginning of the realisation of what a thermodynamic description is and of the degree to which it differs from the standard mechanical description


Introduction

What is entropy? If the questioner’s purpose were to suddenly embarrass a physicist for a moment, that enquiry would certainly achieve its aim—not because there is no answer, but because there are too many: a simple selection from the known historical sources, limited for example to thermodynamics and statistical mechanics, already presents us with several different, seemingly unrelated and far from unambiguous definitions. Against the background of the questions posed, this review aims to map out how, and motivated by what, the thermodynamic concept of entropy historically ramified into two other parallel branches—the statistical and the informational—which in turn have formed the basis of the many and manifold analogues presently in circulation. The aim of such an analysis is not to judge in hindsight how legitimate the conceptual ramifications were and are (for which authoritative literature exists), but rather to understand, at least in part, why the concept of entropy has stimulated the sustained conceptual progress that it has, in its own field as well as in a variety of others. We survey the historical development of the concept from its introduction up to the early 1960s, emphasising at each turning point the aspects that, at least in the eyes of some, called for further progress or suggested the possibility of an extension of its meaning.
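To make the plurality just alluded to concrete, the three canonical definitions can be set side by side. This is a standard textbook illustration, not notation drawn from the sources reviewed here:

```latex
% Clausius (thermodynamic): entropy change along a reversible path
dS = \frac{\delta Q_{\mathrm{rev}}}{T}

% Boltzmann (statistical): entropy of a macrostate realised by W microstates
S = k_B \ln W

% Shannon (information-theoretic): entropy of a discrete distribution p
H(p) = -\sum_i p_i \log_2 p_i
```

The formal kinship between the last two is visible in the equiprobable case: setting $p_i = 1/W$ for all $i$ reduces Shannon’s $H$ to $\log_2 W$, i.e. Boltzmann’s expression up to the choice of constant and logarithm base—yet the conceptual routes by which each formula was reached are, as the following sections retrace, quite distinct.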

Clausius’ Coinage
Entropy in the Cosmogonic Context
Entropy in Statistics Style
Boltzmann’s Versions
Gibbs’ Version
Entropy in Informational Style
Resurrecting Maxwell’s Demon
Literal Interpretation of Szilard’s Entropy Cost of Information
Entropy According to Shannon
Information-Theoretic Interpretation of Physical Entropy
The Concept of Relative Entropy
Other Fruits of the Entropy Tree
Summary and Conclusions