Abstract

About 160 years ago, the concept of entropy was introduced in thermodynamics by Rudolf Clausius. Since then, it has been continually extended, interpreted, and applied by researchers in many scientific fields, such as general physics, information theory, chaos theory, data mining, and mathematical linguistics. This paper presents The Entropy Universe, which aims to review the many variants of entropies applied to time-series. The purpose is to answer research questions such as: How did each entropy emerge? What is the mathematical definition of each variant of entropy? How are entropies related to each other? In which scientific fields is each entropy most applied? We describe in depth the relationship between the most applied entropies in time-series for different scientific fields, establishing a basis for researchers to choose the variant of entropy most suitable for their data. The number of citations over the past sixteen years of each paper proposing a new entropy was also assessed. The Shannon/differential, the Tsallis, the sample, the permutation, and the approximate entropies were the most cited ones. Based on the ten research areas with the most significant number of records obtained in the Web of Science and Scopus, the areas in which the entropies are most applied are computer science, physics, mathematics, and engineering. The universe of entropies is growing each day, either due to the introduction of new variants or due to novel applications. Knowing each entropy’s strengths and limitations is essential to ensure the proper development of this research field.

Highlights

  • Despite its long history, to many, the term entropy still appears not to be understood

  • In 1948, the American Claude Shannon published “A Mathematical Theory of Communication” in the July and October issues of the Bell System Technical Journal [33]. He proposed the notion of entropy to quantify, with absolute precision, the information within a signal as the amount of unexpected data contained in the message (a worked sketch of this quantity follows this list)

  • fuzzy entropy (FuzzyEn) is the negative natural logarithm of the probability that two vectors that are similar for m points remain similar for m + 1 points. FuzzyEn is similar to approximate entropy (ApEn) and sample entropy (SampEn), but replaces the 0-1 judgment of the Heaviside function used by ApEn and SampEn with a fuzzy relationship function [121], the family of exponential functions exp(−(d_ij^m)^n / r), to obtain a fuzzy measurement of the similarity of two vectors based on their shapes (see the sketch after this list)

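The highlight on Shannon introduces his entropy only in words. For a discrete distribution with probabilities p_i, the usual formulation is H = −Σ p_i log p_i, expressed in bits when the logarithm is base 2. The minimal Python sketch below is ours, not code from the paper; the function name shannon_entropy and the coin example are illustrative assumptions.

    import numpy as np

    def shannon_entropy(probabilities, base=2):
        """Shannon entropy H = -sum(p * log(p)) of a discrete distribution.
        base=2 gives the result in bits; base np.e gives nats."""
        p = np.asarray(probabilities, dtype=float)
        p = p[p > 0]                                   # 0 * log(0) is taken as 0
        return float(-np.sum(p * np.log(p)) / np.log(base))

    # A fair coin carries 1 bit of surprise per toss...
    print(shannon_entropy([0.5, 0.5]))                 # 1.0
    # ...while a heavily biased coin is far more predictable.
    print(shannon_entropy([0.9, 0.1]))                 # ~0.469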
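The sketch below illustrates the FuzzyEn computation described in the last highlight, following the widely used Chen et al. scheme: mean-removed templates of length m and m + 1, a fuzzy similarity degree exp(−(d^n)/r) in place of the Heaviside 0-1 rule, and FuzzyEn = ln(phi_m) − ln(phi_{m+1}). The function name fuzzy_entropy, the default parameters (m = 2, r = 0.2·std, n = 2), and the Chebyshev distance between templates are assumptions for illustration, not code from the paper.

    import numpy as np

    def fuzzy_entropy(x, m=2, r=0.2, n=2):
        """Fuzzy entropy of a 1-D series: ln(phi(m)) - ln(phi(m + 1)),
        with similarity of two templates given by exp(-(d**n) / r)."""
        x = np.asarray(x, dtype=float)
        N = len(x)

        def phi(dim):
            count = N - m                   # same number of templates for m and m + 1
            templates = np.array([x[i:i + dim] for i in range(count)])
            templates -= templates.mean(axis=1, keepdims=True)   # remove local baselines
            # Chebyshev distance between every pair of templates
            d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
            sim = np.exp(-(d ** n) / r)     # fuzzy similarity degree
            np.fill_diagonal(sim, 0.0)      # exclude self-matches
            return sim.sum() / (count * (count - 1))

        return float(np.log(phi(m)) - np.log(phi(m + 1)))

    rng = np.random.default_rng(0)
    signal = rng.standard_normal(300)
    print(fuzzy_entropy(signal, m=2, r=0.2 * signal.std()))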

Summary

Introduction

To many, the term entropy still appears not to be understood. In 2019, Namdari and Zhaojun [9] reviewed the entropy concept for uncertainty quantification of stochastic processes of lithium-ion battery capacity data. Such reviews, however, do not present an in-depth analysis of how entropies are related to each other. Several researchers, such as those of references [10,11,12,13,14,15,16,17,18,19], consider entropy an essential tool for time-series analysis and apply this measure in several research areas. We believe that a study of the importance and application of each entropy in time-series will help researchers understand and choose the most appropriate measure for their problem. What are the areas of application of each entropy, and what is their impact on the scientific community?

Building the Universe of Entropies
Early Times of the Entropy Concept
Entropies Derived from Shannon Entropy
Differential Entropy
Spectral Entropy
Tone-Entropy
Wavelet Entropy
Empirical Mode Decomposition Energy Entropy
Calculate the energy entropy of IMF
Particular Cases of Rényi Entropy
Permutation Entropy and Related Entropies
Rank-Based Entropy and Bubble Entropy
Dispersion Entropy and Fluctuation-Based Dispersion Entropy
Fuzzy Entropy
Modified Sample Entropy
Fuzzy Measure Entropy
Kernel Entropies
Multiscale Entropy
Entropy Impact in the Scientific Community
Number of Citations
Conclusions
