Abstract

The problem of how to properly quantify redundant information is an open question that has been the subject of much recent research. Redundant information refers to information about a target variable S that is common to two or more predictor variables Xi. It can be thought of as quantifying overlapping information content or similarities in the representation of S between the Xi. We present a new measure of redundancy that quantifies the common change in surprisal shared between variables at the local or pointwise level. We provide a game-theoretic operational definition of unique information, and use this to derive constraints which are used to obtain a maximum entropy distribution. Redundancy is then calculated from this maximum entropy distribution by counting only those local co-information terms which admit an unambiguous interpretation as redundant information. We show how this redundancy measure can be used within the framework of the Partial Information Decomposition (PID) to give an intuitive decomposition of the multivariate mutual information into redundant, unique and synergistic contributions. We compare our new measure to existing approaches over a range of example systems, including continuous Gaussian variables. Matlab code for the measure is provided, including all considered examples.
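The key pointwise quantities referred to above can be written explicitly. The following is a brief sketch in LaTeX of the standard definitions assumed here (surprisal, local mutual information as a change in surprisal, and local co-information for two predictors); the specific rule for selecting which local co-information terms to count, and the maximum entropy distribution over which they are evaluated, are developed in the paper itself.

\begin{align}
h(x) &= -\log p(x) && \text{(surprisal of the event } x\text{)} \\
i(x; s) &= h(x) - h(x \mid s) && \text{(local mutual information: change in surprisal of } x \text{ given } s\text{)} \\
c(x_1; x_2; s) &= i(x_1; s) + i(x_2; s) - i(x_1, x_2; s) && \text{(local co-information)}
\end{align}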

Highlights

  • Information theory was originally developed as a formal approach to the study of man-made communication systems [1,2]

  • We first review co-information and the Partial Information Decomposition (PID) before presenting Iccs, a new measure of redundancy based on quantifying the common change in surprisal between variables at the local or pointwise level [21,22,23,24,25]

  • While monotonicity has been considered a crucial axiom within the PID framework, we argue that subset equality, usually considered as part of the axiom of monotonicity, is the essential property that permits the use of the redundancy lattice (the standard axioms are restated after this list)

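For reference, the axioms referred to in the last highlight are usually stated as follows (a standard formulation of the Williams-Beer axioms for a redundancy measure $I_\cap$ over sources $A_1, \ldots, A_k$ and target $S$; the notation is assumed, not taken from the text above).

\begin{itemize}
  \item Symmetry: $I_\cap(A_1, \ldots, A_k; S)$ is invariant under permutations of the $A_i$.
  \item Self-redundancy: $I_\cap(A; S) = I(A; S)$ for a single source $A$.
  \item Monotonicity: $I_\cap(A_1, \ldots, A_{k-1}, A_k; S) \le I_\cap(A_1, \ldots, A_{k-1}; S)$.
  \item Subset equality: $I_\cap(A_1, \ldots, A_{k-1}, A_k; S) = I_\cap(A_1, \ldots, A_{k-1}; S)$ if $A_{k-1} \subseteq A_k$.
\end{itemize}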

Summary

Introduction

Information theory was originally developed as a formal approach to the study of man-made communication systems [1,2]. Williams and Beer [6] present a mathematical lattice structure to represent the set theoretic intersections of the mutual information of multiple variables [9]. They use this to decompose the mutual information I(X; S) into terms quantifying the unique, redundant and synergistic information about the independent variable carried by each combination of dependent variables. This gives a complete picture of the representational interactions in the system. We first review co-information and the PID before presenting Iccs, a new measure of redundancy based on quantifying the common change in surprisal between variables at the local or pointwise level [21,22,23,24,25]. We then apply the new measure to continuous Gaussian variables [26].
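For the two-predictor case the decomposition described above takes the following form (a standard statement of the Williams-Beer decomposition, written in LaTeX for reference; the names Red, Unq and Syn for the redundant, unique and synergistic terms are assumed here):

\begin{align}
I(X_1, X_2; S) &= \mathrm{Red}(X_1, X_2; S) + \mathrm{Unq}(X_1; S) + \mathrm{Unq}(X_2; S) + \mathrm{Syn}(X_1, X_2; S) \\
I(X_1; S) &= \mathrm{Red}(X_1, X_2; S) + \mathrm{Unq}(X_1; S) \\
I(X_2; S) &= \mathrm{Red}(X_1, X_2; S) + \mathrm{Unq}(X_2; S)
\end{align}

Since the classical mutual information terms on the left are fixed by the data, specifying any one of the four partial terms (here, the redundancy) determines the other three.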

Definitions
Interpretation
The Partial Information Decomposition
An Example PID: RdnUnqXor
Measuring Redundancy With Minimal Specific Information
Measuring Redundancy With Maximised Co-Information
Other Redundancy Measures
Measuring Redundancy With Pointwise Common Change in Surprisal
Derivation
Calculating Iccs
A Game-Theoretic Operational Definition of Unique Information
Maximum Entropy Optimisation
Properties
Implementation
Two Variable Examples
Binary Logical Operators
Dependence on Predictor-Predictor Correlation
A Problem With the Three Variable Lattice?
Giant Bit and Parity
XorCopy
Other Examples
Continuous Gaussian Variables
Discussion
