Abstract

Thermodynamic modeling of extensive systems usually implicitly assumes the additivity of entropy. Furthermore, if this modeling is based on the concept of Shannon entropy, additivity of the latter function must also be guaranteed. In this case, the constituents of a thermodynamic system are treated as subsystems of a compound system, and the Shannon entropy of the compound system must be subjected to constrained maximization. The scope of this paper is to clarify prerequisites for applying the concept of Shannon entropy and the maximum entropy principle to thermodynamic modeling of extensive systems. This is accomplished by investigating how the constraints of the compound system have to depend on mean values of the subsystems in order to ensure additivity. Two examples illustrate the basic ideas behind this approach, comprising the ideal gas model and condensed phase lattice systems as limiting cases of fluid phases. The paper is the first step towards developing a new approach for modeling interacting systems using the concept of Shannon entropy.

Highlights

  • In his basic work, Shannon [1] defines a function H which measures the amount of information of a system which can reside in either of m possible states by means of the probabilities $p_i$ of the states: $H(p_1, \dots, p_m) = -K \sum_{i=1}^{m} p_i \log p_i$ (1). Both the constant K and the base of the logarithm are arbitrary, as they merely account for a scaling of H. The set of all $p_i$ can be written as a probability distribution $p = \{p_1, \dots, p_m\}$, with the normalization condition $\sum_{i=1}^{m} p_i = 1$ (2). In the following we set K = 1 and choose the natural logarithm.

  • We consider a compound system composed of N subsystems, each characterized by its individual probability distribution:

  • When deriving the Shannon entropy of a compound system by utilizing homogeneity, Equation (8), we made the following assumptions: the first is that of statistically independent subsystems, which may be plausible for many thermodynamic systems as long as the subsystems are ‘not too strongly correlated in some nonlocal sense’ [10].
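The additivity property referred to in the highlights can be checked numerically. The following minimal Python sketch (the two subsystem distributions are illustrative, not taken from the paper) builds the joint distribution of two statistically independent subsystems and verifies that the Shannon entropy of the compound system equals the sum of the subsystem entropies:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum(p_i * ln p_i), with K = 1 and natural log."""
    return -sum(x * math.log(x) for x in p if x > 0.0)

# Illustrative subsystem distributions (assumed for demonstration):
p = [0.5, 0.3, 0.2]
q = [0.6, 0.4]

# Under statistical independence, the compound system's distribution
# is the product p_i * q_j over all pairs of subsystem states:
joint = [pi * qj for pi in p for qj in q]

# Additivity: H of the compound system is the sum of the subsystem entropies.
assert abs(shannon_entropy(joint) - (shannon_entropy(p) + shannon_entropy(q))) < 1e-12
```

The identity follows directly from $\ln(p_i q_j) = \ln p_i + \ln q_j$ together with normalization, which is why the statistical-independence assumption is the crucial one here.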

Introduction

Shannon [1] defines a function H which measures the amount of information of a system which can reside in either of m possible states by means of the probabilities $p_i$ of the states: $H(p_1, \dots, p_m)$. The set of all $p_i$ can be written as a probability distribution p, so the argument list $(p_1, \dots, p_m)$ can be omitted and H can formally be written as a function of the probability distribution: $H(p) = -\sum_{i=1}^{m} p_i \ln p_i$ (3). Previous papers [1,2,3,4,5,6,7,8] worked out that this measure has all the properties of thermodynamic entropy as introduced by statistical physics. The maximum value, $H_{\max} = \ln m$, is attained for uniformly distributed probabilities [6].

Compound Systems
The Bridge to Thermodynamic Entropy
Additivity of Shannon Entropy
Maximization of Unconstrained Systems
Constrained Maximization of a Single System
Constrained Maximization of a Compound System
Systems Subject to Several Constraints
Application to Thermodynamic Modeling of Fluid Phases
Ideal Gas
Condensed Phase Lattice Systems
The concept of subsystems applied to a lattice system
The unconstrained system
System considering constraints
Conclusions