Abstract
We consider the “partial information decomposition” (PID) problem, which aims to decompose the information that a set of source random variables provide about a target random variable into separate redundant, synergistic, union, and unique components. In the first part of this paper, we propose a general framework for constructing a multivariate PID. Our framework is defined in terms of a formal analogy with intersection and union from set theory, along with an ordering relation which specifies when one information source is more informative than another. Our definitions are algebraically and axiomatically motivated, and can be generalized to domains beyond Shannon information theory (such as algorithmic information theory and quantum information theory). In the second part of this paper, we use our general framework to define a PID in terms of the well-known Blackwell order, which has a fundamental operational interpretation. We demonstrate our approach on numerous examples and show that it overcomes many drawbacks associated with previous proposals.
Highlights
We propose a new general framework for defining the partial information decomposition (PID)
Our framework is motivated in several ways, including a formal analogy with intersections and unions in set theory as well as an axiomatic derivation
One unusual aspect of our framework is that it provides separate measures of redundancy and union information
Summary
The PID is motivated by an informal analogy with set theory [12]. In particular, redundancy is interpreted analogously to the size of the intersection of the sources X1, ..., Xn, while union information is interpreted analogously to the size of their union. We propose to define the PID by making this analogy formal, in particular by going back to the algebraic definitions of intersection and union in set theory. Equations (5) and (6) are useful because they express the size of the intersection and union via an optimization over simpler terms: the size of individual sets, |T|, and the subset inclusion relation, ⊆. We translate these definitions to the information-theoretic setting of the PID. For example, union information I∪ is the minimum information about Y in any random variable that is more informative than all of the sources. Readers who are more interested in the use of our framework to define concrete measures of redundancy and union information may skip to Section 5.
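Equations (5) and (6) are not reproduced in this summary, but based on the description above they plausibly take the following form, where the size of an intersection (union) is recovered by optimizing over subsets (supersets); the information-theoretic translation, with an informativeness order ⊒ over random variables (e.g., the Blackwell order) in place of ⊆, is sketched alongside. This is an illustrative reconstruction, not a verbatim copy of the paper's equations:

```latex
% Set-theoretic definitions via optimization (sketch of Eqs. (5)-(6)):
%   intersection = largest set contained in every A_i,
%   union        = smallest set containing every A_i.
\left| A_1 \cap \dots \cap A_n \right|
    = \max_{T \,:\, T \subseteq A_i \ \forall i} |T|
\qquad
\left| A_1 \cup \dots \cup A_n \right|
    = \min_{T \,:\, T \supseteq A_i \ \forall i} |T|

% Information-theoretic translation (sketch): replace |T| with I(Q;Y)
% and subset inclusion with an informativeness order on channels.
I_\cap(X_1, \dots, X_n \to Y)
    = \max_{Q \,:\, X_i \sqsupseteq Q \ \forall i} I(Q; Y)
\qquad
I_\cup(X_1, \dots, X_n \to Y)
    = \min_{Q \,:\, Q \sqsupseteq X_i \ \forall i} I(Q; Y)
```

The second pair of expressions matches the verbal definition given above: union information is the minimum information about Y in any random variable Q that is more informative than all of the sources, and redundancy is its intersection-style dual.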