Abstract

The study of dense gases and liquids requires consideration of the interactions between the particles and the correlations created by these interactions. In this article, the N-variable distribution function that maximizes the Uncertainty (Shannon's information entropy) while admitting a prescribed set of (N−1)-variable distribution functions as marginals is, by definition, free of N-order correlations. This way of defining correlations is valid for stochastic systems described by discrete or continuous variables and for equilibrium or non-equilibrium states, and it allows correlations of the different orders to be defined and measured. It also makes it possible to build grand-canonical expressions of the Uncertainty valid for either a dilute or a dense gas system. At equilibrium, for both kinds of systems, the Uncertainty becomes identical to the expression of the thermodynamic entropy. The method also yields two interesting by-products: (i) the Kirkwood superposition approximation and (ii) a series of generalized superposition approximations. A theorem on the temporal evolution of the relevant Uncertainty is proved for molecular systems governed by two-body forces, and a closely related conjecture sheds new light on the origin of the irreversibility of molecular systems. In this respect, the irreplaceable role played by three-body interactions is highlighted.
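
Point (i) can be made concrete. The following Python sketch (ours, not taken from the article; the variable names and the restriction to three binary variables are illustrative assumptions) builds the Kirkwood superposition closure of a three-variable joint law from its one- and two-variable marginals and compares the resulting Uncertainties:

# Minimal numerical sketch (illustrative, not the article's code): build the
# Kirkwood superposition approximation
#     P(1,2,3) ≈ P(1,2) P(1,3) P(2,3) / (P(1) P(2) P(3))
# for three binary variables and compare Shannon Uncertainties.
import numpy as np

rng = np.random.default_rng(0)
P = rng.random((2, 2, 2))                      # arbitrary 3-variable joint law
P /= P.sum()

# One- and two-variable marginals, obtained by summing out variables.
P1, P2, P3 = P.sum((1, 2)), P.sum((0, 2)), P.sum((0, 1))
P12, P13, P23 = P.sum(2), P.sum(1), P.sum(0)

# Kirkwood closure, renormalized so that it is a proper distribution.
K = (P12[:, :, None] * P13[:, None, :] * P23[None, :, :]
     / (P1[:, None, None] * P2[None, :, None] * P3[None, None, :]))
K /= K.sum()

def uncertainty(p):
    """Shannon Uncertainty H = -sum p log p (natural logarithm)."""
    p = p[p > 0]
    return -(p * np.log(p)).sum()

# The closure discards genuine three-order correlations, so its Uncertainty
# typically exceeds that of the true joint law.
print(uncertainty(P), uncertainty(K))

Extending this construction to higher-order marginals is the natural route to the generalized superposition approximations of point (ii), though the article's precise form may differ.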

Highlights

  • Shannon’s formula allows measuring to what extent the outcome of an event described by some probability law is uncertain

  • Shannon’s formula [1,2,3,4] is extensively used in this study

  • The way of defining the uncorrelated distribution described in the introduction is fully consistent with the equilibrium distributions of statistical mechanics, such as the Maxwell–Boltzmann distribution for non-interacting molecules



Introduction

Shannon’s formula allows measuring to what extent the outcome of an event described by some probability law is uncertain. This measure is called Uncertainty throughout this work to avoid confusion with entropy. Shannon’s formula may be applied to the original joint probability distribution P(x1, x2, …, xN) as well as to the product of its one-variable marginals P(x1)P(x2)⋯P(xN). The difference between the two expressions vanishes if all N variables are independent; in general, it measures the non-independence among the N variables. It does not, however, indicate to what extent this non-independence is due to pair, triple, or higher-order interactions.
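
As a concrete illustration (our sketch, not code from the article; the function names are ours), the Python snippet below evaluates this difference, often called the total correlation, for a discrete joint law; it vanishes when the law factorizes, but it lumps together the contributions of pairs, triples, and higher orders:

# Illustrative sketch: the difference between the Uncertainty of the product
# of marginals and that of the joint law measures non-independence globally.
import numpy as np

def uncertainty(p):
    """Shannon Uncertainty H = -sum p log p (natural logarithm)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def total_correlation(joint):
    """Sum of one-variable marginal Uncertainties minus the joint Uncertainty."""
    axes = range(joint.ndim)
    marginals = [joint.sum(axis=tuple(j for j in axes if j != i))
                 for i in axes]
    return sum(uncertainty(m) for m in marginals) - uncertainty(joint)

rng = np.random.default_rng(1)
P = rng.random((2, 2, 2)); P /= P.sum()        # correlated 3-variable law
Q = np.multiply.outer(np.outer(P.sum((1, 2)), P.sum((0, 2))), P.sum((0, 1)))

print(total_correlation(P))                    # > 0: variables not independent
print(total_correlation(Q))                    # ≈ 0: product of marginals

Separating this global measure into pair, triple, and higher-order parts is precisely what the maximum-Uncertainty construction of this article is designed to achieve.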

