Abstract

Entropy is a universal concept in science, suitable for quantifying the uncertainty of a series of random events. We define and describe this notion in a manner appropriate for physicists. We start with a brief recapitulation of the basic concepts of probability theory that are needed to define entropy, and we sketch the history of how this concept reached its present exact form. We show that the Shannon entropy represents the most adequate measure of the probabilistic uncertainty of a random object. Although the notion of entropy was introduced in classical thermodynamics as a thermodynamic state variable, it relies on concepts studied in probability theory and mathematical statistics. We point out that the whole formalism of statistical mechanics can be rewritten in terms of Shannon entropy. The notion of entropy is understood differently in various scientific disciplines: in classical physics it denotes a thermodynamic state variable; in communication theory, the efficiency of information transmission; in the theory of general systems, the magnitude of configurational order; in ecology, a measure of biodiversity; in statistics, the degree of disorder; and so on. All these notions can be mapped onto the general mathematical concept of entropy. By means of entropy, the configurational order of complex systems can be exactly quantified. Besides the Shannon entropy, there exists a class of Shannon-like entropies which converge, under certain circumstances, toward the Shannon entropy. A Shannon-like entropy is sometimes easier to handle mathematically than the Shannon entropy; one important example is the well-known Tsallis entropy. The applications of the Shannon and Shannon-like entropies in science are remarkably versatile: besides statistical physics, they play a fundamental role in quantum information, communication theory, the description of disorder, and elsewhere.
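To make the convergence claim concrete, here is a minimal numerical sketch of our own (not taken from the paper): the Tsallis entropy S_q(P) = (1 − Σ p_i^q)/(q − 1) approaches the Shannon entropy H(P) = −Σ p_i ln p_i as q → 1. The particular distribution P and the natural-logarithm convention are illustrative assumptions.

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(P) = -sum_i p_i ln p_i (natural log, k_B = 1)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(P) = (1 - sum_i p_i**q) / (q - 1), for q != 1."""
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

P = [0.5, 0.3, 0.2]                      # an illustrative distribution
print("Shannon:", shannon_entropy(P))    # ~1.0297 nats
for q in (2.0, 1.5, 1.1, 1.01, 1.001):
    print("q =", q, "Tsallis:", tsallis_entropy(P, q))
# The Tsallis values approach the Shannon value as q -> 1.
```

Taking the limit q → 1 analytically (for instance by l'Hôpital's rule in q) recovers −Σ p_i ln p_i exactly.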

Highlights

  • At the most fundamental level, all our further considerations rely on the concept of probability

  • Moment-based uncertainty measures are given, as a rule, by the higher statistical moments of a random variable x (see the sketch after this list)

  • From what has been said so far it follows that the concept of entropy is inherently connected with the probability distribution of the outcomes of a random trial
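As a rough illustration of the moment-based measures mentioned above (our own sketch; the example distributions are invented for illustration), the following compares the variance, i.e. the second central moment, which depends on the outcome values, with the Shannon entropy, which depends only on the probabilities:

```python
import math

def central_moment(values, probs, k):
    """k-th central moment of a discrete random variable (k = 2 is the variance)."""
    mean = sum(v * p for v, p in zip(values, probs))
    return sum(((v - mean) ** k) * p for v, p in zip(values, probs))

def shannon_entropy(probs):
    """Shannon entropy in nats; a function of the probabilities only."""
    return -sum(p * math.log(p) for p in probs if p > 0)

values = [0, 1, 2, 3]
narrow = [0.05, 0.45, 0.45, 0.05]   # mass concentrated near the middle
wide   = [0.25, 0.25, 0.25, 0.25]   # uniform over the same outcomes

for name, probs in (("narrow", narrow), ("wide", wide)):
    print(name,
          "variance:", round(central_moment(values, probs, 2), 3),
          "entropy:",  round(shannon_entropy(probs), 3))
```

The two measures answer different questions: moments weigh how far the outcome values lie from the mean, while the entropy ignores the outcome values entirely and depends only on the probability distribution.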



Introduction

At the most fundamental level, all our further considerations rely on the concept of probability. A random trial is characterized by the set of its outcomes (values) and the corresponding probability distribution. To every random trial there is assigned a random variable x, a mathematical quantity that assumes a set of values with the corresponding probabilities. The Shannon entropy H(P) = −Σ_i p_i log p_i is a real, non-negative number, and it is a function only of the components of the probability distribution P. Shannon entropy satisfies the following demands (see Appendix): (i) if the probability distribution contains only one nonzero component, e.g. P = (1), so that one outcome is certain, then H(P) = 0; (iii) for the uniform probability distribution Pu, H(Pu) becomes maximal: when the probabilities of all outcomes are equal, the mean uncertainty of the random trial is greatest.
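To make properties (i) and (iii) concrete, here is a minimal Python sketch of our own (not the paper's); the natural-log convention and the four-outcome distributions are illustrative assumptions.

```python
import math
import random

def shannon_entropy(p):
    """H(P) = -sum_i p_i ln p_i; the sum skips zero-probability outcomes."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

n = 4
certain = [1.0, 0.0, 0.0, 0.0]   # property (i): one outcome is certain
uniform = [1.0 / n] * n          # property (iii): all outcomes equally likely

print(shannon_entropy(certain))  # 0.0 -- no uncertainty at all
print(shannon_entropy(uniform))  # ln 4 ~ 1.386 -- the maximum for n = 4

# Any other distribution over n outcomes stays below the uniform value:
raw = [random.random() for _ in range(n)]
p = [x / sum(raw) for x in raw]
assert shannon_entropy(p) <= math.log(n) + 1e-12
```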

Entropy as a Quantifier of Configurational Order
The Concept of Entropy in Thermodynamics and Statistical Physics
The Shannon-Like Entropies
Conclusions
