The concept of temperature is one of the key ideas in describing the thermodynamic properties of systems. In the classical statistical mechanics of ideal gases, the notion of temperature can be described in at least two different ways: the kinetic temperature (related to the average kinetic energy of the particles) and the thermodynamic temperature (related to the ratio between infinitesimal changes in entropy and energy). For the Boltzmann distribution, the two notions lead to the same result. However, for nonequilibrium phenomena, while the kinetic temperature has been commonly used for both theoretical and simulation purposes, there appears to be no corresponding general definition of a thermodynamic or entropic temperature. In this paper, we consider the statistical or Shannon entropy of a system and use the "de Bruijn identity" from information theory (see Appendix A 2 for a derivation of this identity) to show that it is possible to define a "Shannon temperature" or "entropic temperature" T for a nonequilibrium system as the ratio of the average curvature of the Hamiltonian function associated with the system to the trace of the Fisher information matrix of the nonequilibrium probability distribution (see Appendix A 1 for a definition of the Fisher information). We show that this definition subsumes many other attempts at defining entropic temperatures for nonequilibrium systems and is not restricted to equilibrium or near-equilibrium systems. The gist of our approach is to use the Shannon or Gibbs entropy of a system and take the relation dS = dQ_rev/T as a definition of temperature. We achieve this by positing a statistical notion of infinitesimal heating as the addition of uncorrelated random variables (in a special way). As an example of the utility of such a definition, we obtain the nonequilibrium entropic temperature for a system obeying the Langevin equations. For such a system, we show that while the kinetic temperature is related to changes in the energy of the system, the entropic or Shannon temperature is related to changes in the entropy of the system. We show that this notion, together with the well-known Cramér-Rao inequality from statistics, establishes the validity of the second law of thermodynamics for such a nonequilibrium system.
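To make the stated definition concrete, the sketch below estimates the entropic temperature T = ⟨∇²H⟩ / tr(J), i.e., the ratio of the average Hamiltonian curvature to the trace of the Fisher information matrix of the phase-space distribution, by Monte Carlo for an equilibrium one-dimensional harmonic oscillator, where the ratio should reduce to the ordinary Boltzmann temperature. This is a minimal illustrative check, not code from the paper: the Hamiltonian H = v²/2 + ω²x²/2, the frequency `omega`, the reference temperature `T_true`, and units with k_B = 1 are all assumptions made here for the example.

```python
import numpy as np

# Illustrative sanity check (assumptions, not from the paper): for the
# equilibrium canonical distribution p(x, v) ~ exp(-H / T) of a 1D harmonic
# oscillator with H = v**2/2 + omega**2 * x**2 / 2 and k_B = 1, the ratio
#     E[laplacian(H)] / trace(Fisher information of p)
# should recover the Boltzmann temperature T.

rng = np.random.default_rng(0)
T_true = 1.7   # assumed equilibrium temperature (k_B = 1)
omega = 2.0    # assumed oscillator frequency
n = 1_000_000

# Sample (x, v) from the Gaussian canonical distribution.
x = rng.normal(0.0, np.sqrt(T_true) / omega, n)
v = rng.normal(0.0, np.sqrt(T_true), n)

# Average curvature of the Hamiltonian: laplacian(H) = omega**2 + 1,
# a constant for this quadratic H.
avg_curvature = np.mean(omega**2 + 1.0 + 0.0 * x)

# Trace of the Fisher information matrix, tr(J) = E[|grad log p|^2],
# estimated from the score, which is known in closed form at equilibrium:
# d(ln p)/dx = -omega**2 * x / T,  d(ln p)/dv = -v / T.
score_x = -omega**2 * x / T_true
score_v = -v / T_true
fisher_trace = np.mean(score_x**2) + np.mean(score_v**2)

T_entropic = avg_curvature / fisher_trace
print(f"entropic temperature ~ {T_entropic:.3f}  (expected {T_true})")
```

At equilibrium the score ∇ ln p is available in closed form, which keeps the check simple; for a genuinely nonequilibrium state the score (and hence tr(J)) would have to be estimated from the nonequilibrium distribution itself, for example from a solution of the corresponding Fokker-Planck or Langevin dynamics.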