Abstract
The purpose of the paper is to introduce, using the known results concerning entropy in product MV algebras, the concepts of mutual information and Kullback–Leibler divergence for product MV algebras, and to examine the algebraic properties of the proposed measures. In particular, the convexity of the Kullback–Leibler divergence with respect to states in product MV algebras is proved, and chain rules for mutual information and Kullback–Leibler divergence are established. In addition, the data processing inequality for conditionally independent partitions in product MV algebras is proved.
Highlights
The notions of entropy and mutual information are fundamental concepts in information theory [1]; they are used as measures of the information obtained from a realization of the considered experiments.
Let m1, m2 be states defined on a given product MV algebra (M, ·), and let A = {a1, ..., an} be a partition of (M, ·).
Then DA(m1 ‖ m2) ≥ 0 (Gibbs' inequality), with equality if and only if m1(ai) = m2(ai) for i = 1, 2, ..., n.
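Gibbs' inequality can be illustrated numerically. In the sketch below (an illustration, not code from the paper), each state is represented simply by its vector of values (m(a1), ..., m(an)) on the elements of the partition, which sum to 1:

```python
import math

def kl_divergence(m1, m2):
    """Kullback-Leibler divergence D_A(m1 || m2) of two states over a
    partition A = {a1, ..., an}, each state given as the probability
    vector of its values on a1, ..., an (0 * log 0 is taken as 0)."""
    return sum(p * math.log(p / q) for p, q in zip(m1, m2) if p > 0)

# Two states on a three-element partition (illustrative values):
m1 = [0.5, 0.3, 0.2]
m2 = [0.4, 0.4, 0.2]

print(kl_divergence(m1, m2) >= 0)  # Gibbs' inequality: True
print(kl_divergence(m1, m1) == 0)  # equality when m1(ai) = m2(ai): True
```

Note that the divergence is not symmetric in general: D_A(m1 ‖ m2) and D_A(m2 ‖ m1) usually differ.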
Summary
The notions of entropy and mutual information are fundamental concepts in information theory [1]; they are used as measures of the information obtained from a realization of the considered experiments. It was shown that the entropy of the fuzzy partitions introduced and studied in [6] can be considered as a special case of their mutual information. The entropy and the conditional entropy of partitions in a product MV algebra satisfy all the properties analogous to the properties of Shannon's entropy of measurable partitions in the classical case; the proofs can be found in [35,49,50]. We present those properties that will be exploited further.
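One such Shannon-type property is the chain rule H(A ∨ B) = H(A) + H(B | A) for the join of two partitions. As an illustrative sketch (the 2×2 joint table below is an assumption for the example, not data from the paper), it can be checked numerically when the joint state values on the refinement A ∨ B are known:

```python
import math

def entropy(probs):
    """Shannon entropy H = -sum p * log p (0 * log 0 taken as 0)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

# Assumed joint values m(ai . bj) on the refinement A v B of two
# two-element partitions A and B (rows indexed by A, columns by B):
joint = [[0.3, 0.2],
         [0.1, 0.4]]

p_a = [sum(row) for row in joint]  # marginal values m(ai)

h_joint = entropy([p for row in joint for p in row])   # H(A v B)
h_a = entropy(p_a)                                     # H(A)
# Conditional entropy H(B | A) = sum_i m(ai) * H(B given ai):
h_b_given_a = sum(
    p_a[i] * entropy([p / p_a[i] for p in joint[i]])
    for i in range(len(joint)) if p_a[i] > 0
)

# Chain rule: H(A v B) = H(A) + H(B | A), up to rounding error.
print(abs(h_joint - (h_a + h_b_given_a)) < 1e-12)  # True
```

The same bookkeeping gives the mutual information as I(A, B) = H(B) − H(B | A).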