Abstract

The concept of entropy plays a significant role in thermodynamics and information theory, and it remains an active area of research. Information entropy, as a measure of information, takes many forms, such as Shannon entropy and Deng entropy, yet there is no unified interpretation of information from a measurement perspective. To address this issue, this article proposes Generalized Information Entropy (GIE), which unifies entropies defined on mass functions. GIE also establishes the relationship between entropy, fractal dimension, and the number of events. On this basis, Generalized Information Dimension (GID) is proposed, extending the definition of information dimension from probability distributions to mass functions. GIE is useful for approximate calculation and in coding systems. In coding applications, information viewed from the perspective of GIE exhibits a particle-like character, in that the same event can have different representational states, analogous to the number of microscopic states in Boltzmann entropy.
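As context for the entropies the abstract names, the sketch below computes the standard Shannon and Deng entropies. The GIE itself is not defined in this abstract, so it is not implemented here; the mass-function representation (a dict from focal elements to masses) is an illustrative choice, not the paper's notation.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum p * log2(p) over a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def deng_entropy(mass):
    """Deng entropy over a mass function.

    mass: dict mapping each focal element A (a frozenset) to its mass m(A).
    E_d = -sum m(A) * log2( m(A) / (2^|A| - 1) ).
    """
    return -sum(m * math.log2(m / (2 ** len(A) - 1))
                for A, m in mass.items() if m > 0)

# When every focal element is a singleton, 2^|A| - 1 = 1 and Deng entropy
# reduces to Shannon entropy, illustrating how one generalizes the other.
p = [0.5, 0.5]
m = {frozenset({'a'}): 0.5, frozenset({'b'}): 0.5}
print(shannon_entropy(p))  # 1.0
print(deng_entropy(m))     # 1.0
```

For a mass function with a compound focal element, e.g. `m({a, b}) = 1`, Deng entropy gives `log2(3)`, reflecting the larger number of possible states, which echoes the abstract's analogy to microscopic states in Boltzmann entropy.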
