Abstract
The classical information-theoretic measures such as the entropy and the mutual information (MI) are widely applicable to many areas in science and engineering. Csiszár generalized the entropy and the MI by using convex functions. Recently, we proposed the grid occupancy (GO) and the quasientropy (QE) as measures of independence. The QE explicitly includes a convex function in its definition, while the expectation of the GO is a subclass of the QE. In this paper, we study the effect of different convex functions on the GO, the QE, and Csiszár's generalized mutual information (GMI). A quality factor (QF) is proposed to quantify the sharpness of their minima. Using the QF, it is shown that these measures can have sharper minima than the classical MI. In addition, a recursive algorithm for computing the GMI, which generalizes Fraser and Swinney's algorithm for computing the MI, is proposed. Moreover, we apply the GO, the QE, and the GMI to chaotic time series analysis. It is shown that these measures are good criteria for determining the optimum delay in strange attractor reconstruction.
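For orientation, a Csiszár-type generalization replaces the logarithm inside the Kullback divergence with a general convex function f. A standard way to write the resulting divergence between the joint PDF and the product of the marginal PDFs, which is the form on which a generalized MI can be built, is the sketch

\[
  I_f(X;Y) \;=\; \iint p_X(x)\, p_Y(y)\, f\!\left( \frac{p_{XY}(x,y)}{p_X(x)\, p_Y(y)} \right) dx\, dy, \qquad f \text{ convex},\ f(1) = 0,
\]

where the choice f(u) = u log u recovers the Shannon MI. The symbol I_f and this textbook f-divergence form are assumptions made here for illustration; the paper's exact GMI definition may differ in normalization or in how the convex function enters.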
Highlights
The advent of information theory was marked by Shannon's seminal paper [1]
The Shannon mutual information (MI) can be viewed as the Kullback divergence between the joint probability density function (PDF) and the product of the marginal PDFs (see the formula after this list)
The independence of variables is equivalent to the uniformity of the joint PDF of the variables obtained by transforming the original variables by their respective cumulative distribution functions (CDFs) (see the sketch after this list)
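To make the second highlight concrete, the Shannon MI is the Kullback divergence between the joint PDF and the product of the marginal PDFs,

\[
  I(X;Y) \;=\; D\big(p_{XY} \,\|\, p_X p_Y\big)
  \;=\; \iint p_{XY}(x,y)\, \log \frac{p_{XY}(x,y)}{p_X(x)\, p_Y(y)}\, dx\, dy,
\]

which vanishes exactly when X and Y are independent.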
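The third highlight is the idea behind the grid occupancy: once each variable is pushed through its own CDF (in practice, an empirical rank transform), the transformed pair is uniformly distributed on the unit square exactly when the original variables are independent, so counting how many cells of a regular grid are occupied gives a simple indicator of dependence. The following Python sketch illustrates that idea; the function names and the plain cell-counting statistic are illustrative assumptions, not the paper's exact GO definition.

import numpy as np

def empirical_cdf_transform(x):
    """Map samples to (0, 1) via their empirical CDF (a rank transform)."""
    ranks = np.argsort(np.argsort(x))
    return (ranks + 0.5) / len(x)

def grid_occupancy(x, y, n_cells=10):
    """Fraction of cells of an n_cells-by-n_cells grid on the unit square
    that contain at least one CDF-transformed sample pair (illustrative
    statistic only). Under independence the transformed points are roughly
    uniform and most cells are hit; strong dependence concentrates the
    points and leaves more cells empty."""
    u = empirical_cdf_transform(np.asarray(x))
    v = empirical_cdf_transform(np.asarray(y))
    i = np.minimum((u * n_cells).astype(int), n_cells - 1)
    j = np.minimum((v * n_cells).astype(int), n_cells - 1)
    occupied = np.zeros((n_cells, n_cells), dtype=bool)
    occupied[i, j] = True
    return occupied.mean()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(size=2000)
    print(grid_occupancy(x, rng.normal(size=2000)))             # independent: close to 1
    print(grid_occupancy(x, x + 0.05 * rng.normal(size=2000)))  # dependent: noticeably smaller

A coarse grid (here 10 by 10) keeps the occupancy estimate stable for moderate sample sizes; finer grids need correspondingly more data.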
Summary
The advent of information theory was marked by Shannon's seminal paper [1], which established some fundamental measures of information, among them the entropy. Some convex functions yield QFs of higher order than that of the negative Shannon entropy and the MI, which implies that the related measures have sharper minima than the negative Shannon entropy and the MI do. Following Shaw's suggestion, Fraser and Swinney used the first minimum of the Shannon MI to choose the delay; they proposed a recursive algorithm for computing the Shannon MI and showed that the MI is more advantageous than the correlation function, which only takes second-order dependence into account [17]. Throughout, the word "entropy," when it appears alone, refers to the Shannon entropy, and the term "MI," when it appears alone, refers to the Shannon MI, as is the common usage.
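Fraser and Swinney's prescription mentioned above is to take the reconstruction delay at the first minimum of the MI between x(t) and x(t+tau). Their algorithm is recursive and uses an adaptive partition of the plane; the Python sketch below substitutes a plain fixed-bin histogram estimator, which is only a simplified stand-in, to show how the first-minimum criterion is applied to a time series. The function names are illustrative.

import numpy as np

def histogram_mi(x, y, bins=16):
    """Shannon MI (in nats) of two 1-D series from a fixed-bin 2-D histogram;
    a simplified stand-in for Fraser and Swinney's adaptive-partition estimator."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def first_mi_minimum(series, max_delay=100, bins=16):
    """Smallest delay tau at which MI(x(t), x(t+tau)) has a local minimum,
    i.e. the delay suggested for strange attractor reconstruction."""
    series = np.asarray(series)
    mi = [histogram_mi(series[:-tau], series[tau:], bins)
          for tau in range(1, max_delay + 1)]
    for k in range(1, len(mi) - 1):
        if mi[k] < mi[k - 1] and mi[k] < mi[k + 1]:
            return k + 1               # mi[0] corresponds to delay 1
    return int(np.argmin(mi)) + 1      # fall back to the global minimum

if __name__ == "__main__":
    t = np.arange(0, 200, 0.05)
    x = np.sin(t) + 0.01 * np.random.default_rng(1).normal(size=t.size)
    print("suggested delay:", first_mi_minimum(x))

The paper applies the GO, the QE, and the GMI as the dependence measure in place of the Shannon MI for the same purpose of determining the optimum delay.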