Abstract

The limited data available about a macroscopic system may come in various forms: sharp constraints, expectation values, or control parameters. While these data impose constraints on the state, they do not specify it uniquely; a further principle—the maximum entropy principle—must be invoked to construct it. This chapter discusses basic notions of information theory and why entropy may be regarded as a measure of ignorance. It shows how the state—called a Gibbs state—is constructed using the maximum entropy principle, and elucidates its generic properties, which are conveniently summarized in a thermodynamic square. The chapter further discusses the second law and how it is linked to the reproducibility of macroscopic processes. It introduces the concepts of equilibrium and temperature, as well as pressure and chemical potential. Finally, this chapter considers statistical fluctuations of the energy and of other observables when these are given only as expectation values.

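The maximum-entropy construction mentioned in the abstract can be made concrete with a minimal numerical sketch. Assuming a system with discrete energy levels and a single expectation-value constraint on the energy, maximizing the entropy yields the Gibbs state p_i ∝ exp(−λE_i), with the Lagrange multiplier λ fixed so that the constraint is satisfied. The function name `gibbs_state` and the bisection approach below are illustrative choices, not part of the chapter itself:

```python
import math

def gibbs_state(energies, mean_energy, tol=1e-10):
    """Maximum-entropy distribution over discrete energy levels,
    constrained to reproduce a given expectation value of the energy.

    The maximizer is the Gibbs state p_i ∝ exp(-lam * E_i); the Lagrange
    multiplier lam is found by bisection, using the fact that the
    expectation value <E>(lam) is monotonically decreasing in lam.
    Assumes mean_energy lies strictly between min and max of energies."""
    def avg(lam):
        weights = [math.exp(-lam * e) for e in energies]
        z = sum(weights)  # partition function
        return sum(w * e for w, e in zip(weights, energies)) / z

    lo, hi = -50.0, 50.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if avg(mid) > mean_energy:   # <E> too high: increase lam
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    weights = [math.exp(-lam * e) for e in energies]
    z = sum(weights)
    return [w / z for w in weights], lam

# Two-level system with E = 0, 1 constrained to <E> = 0.25;
# analytically this gives lam = ln 3.
p, lam = gibbs_state([0.0, 1.0], 0.25)
```

For this two-level example the exact solution is p = (3/4, 1/4) and λ = ln 3, which the bisection reproduces; the same routine handles any finite set of levels, illustrating how an expectation-value datum, by itself insufficient to fix the state, determines it uniquely once the maximum entropy principle is imposed.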