Abstract

The decomposition of channel information into synergies of different order is an open, active problem in the theory of complex systems. Most approaches to the problem are based on information theory, and propose decompositions of mutual information between inputs and outputs in several ways, none of which is generally accepted yet. We propose a new point of view on the topic. We model a multi-input channel as a Markov kernel. We can project the channel onto a series of exponential families which form a hierarchical structure. This is carried out with tools from information geometry, in a way analogous to the projections of probability distributions introduced by Amari. A Pythagorean relation leads naturally to a decomposition of the mutual information between inputs and outputs into terms which represent single node information; pairwise interactions; and in general n-node interactions. The synergy measures introduced in this paper can be easily evaluated by an iterative scaling algorithm, which is a standard procedure in information geometry.
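The iterative scaling algorithm mentioned above can be sketched for its simplest instance: the I-projection of a joint distribution onto the set of distributions matching a prescribed family of marginals (classic iterative proportional fitting). The following is a minimal NumPy sketch under that assumption, not the authors' implementation; the function names are illustrative. Applied to the XOR channel, the divergence from the pairwise family recovers the 1 bit of purely threewise information discussed in the paper.

```python
import numpy as np

def iterative_scaling(p, margin_axes, n_iter=100):
    """I-project p onto the family of distributions matching p's marginals
    over each axis set in margin_axes (iterative proportional fitting).
    Minimal sketch; the paper's hierarchy of exponential families is more general."""
    q = np.ones_like(p) / p.size                    # start from the uniform distribution
    for _ in range(n_iter):
        for axes in margin_axes:
            other = tuple(ax for ax in range(p.ndim) if ax not in axes)
            target = p.sum(axis=other, keepdims=True)   # marginal to be matched
            current = q.sum(axis=other, keepdims=True)  # marginal of the current iterate
            ratio = np.zeros_like(current)
            np.divide(target, current, out=ratio, where=current > 0)
            q = q * ratio                           # rescale so this marginal matches
    return q

def kl_bits(p, q):
    """KL divergence D(p || q) in bits, restricted to the support of p."""
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

# XOR channel: joint distribution of (X1, X2, Y) with Y = X1 XOR X2
p = np.zeros((2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        p[x1, x2, x1 ^ x2] = 0.25

# Projection onto the family fixing all pairwise marginals
q = iterative_scaling(p, [(0, 1), (0, 2), (1, 2)])
# kl_bits(p, q) = 1 bit: none of XOR's information is pairwise
```

Here the projection `q` is simply the uniform distribution, since every pairwise marginal of the XOR joint is already uniform; the whole bit of mutual information shows up as divergence from the pairwise family.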

Highlights

  • In complex systems like biological networks, for example neural networks, a basic principle is that their functioning is based on the correlation and interaction of their different parts

  • If instead Y is given by the threewise parity function X1 ⊕ X2 ⊕ X3, we have again 1 bit of mutual information, which now arises purely from a threewise synergy, so here k ∈ E3, and the only non-zero term in Equation (37) is d3(k)

  • We can see that the red lines approximate well the lines of constant mutual information, at least qualitatively, but they are not exactly equal
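The parity highlight above can be checked directly: with three uniform input bits and Y = X1 ⊕ X2 ⊕ X3, the inputs jointly carry 1 bit about Y, while any single input, or any pair, carries none. A small illustrative computation (the helper names are my own, not from the paper):

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits of a distribution given as an array."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def mi_bits(pxy):
    """I(X;Y) in bits for a two-dimensional joint distribution p(x, y)."""
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    return entropy_bits(px) + entropy_bits(py) - entropy_bits(pxy)

# Joint of (X1, X2, X3, Y) with uniform input bits and Y = X1 XOR X2 XOR X3
p = np.zeros((2, 2, 2, 2))
for x1 in range(2):
    for x2 in range(2):
        for x3 in range(2):
            p[x1, x2, x3, x1 ^ x2 ^ x3] = 1 / 8

all_inputs = p.reshape(8, 2)          # (X1, X2, X3) vs Y: 1 bit
one_input = p.sum(axis=(1, 2))        # X1 vs Y: 0 bits
two_inputs = p.sum(axis=2).reshape(4, 2)  # (X1, X2) vs Y: 0 bits
```

Since no strict subset of the inputs reduces the uncertainty about Y, the whole bit is third-order, matching the claim that only d3(k) is non-zero.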

Summary

INTRODUCTION

In complex systems such as biological networks, for example neural networks, a basic principle is that their functioning is based on the correlation and interaction of their different parts. In a system with highly correlated inputs, for example, the synergy would remain unseen, as it would be canceled by the redundancy. This picture breaks down for more than three nodes. Another problem, pointed out in Schneidman et al. (2003b) and Amari (2001), is that redundancy (as, for example, in X = Y = Z) can be described in terms of pairwise interactions, not triple, while synergy (as in the XOR function) is purely threewise. One may want to use this measure of synergy to form a complete decomposition analogous to Equation (8), but this does not work, as in general it is not true that d2 ≤ I(X:Y:Z). For this reason, we keep the decomposition coarser and do not divide union information into unique and redundant parts. This allowed us to obtain precise quantities for all the examples considered.
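One way to see why a bound like d2 ≤ I(X:Y:Z) can fail is that the triple mutual information changes sign between the two canonical examples above: it is +1 bit for the redundant copy X = Y = Z and −1 bit for XOR, while divergence-based quantities such as d2 are nonnegative. A hedged sketch of this arithmetic (helper names are illustrative, not from the paper), computing I(X:Y:Z) by inclusion–exclusion over entropies:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits of a distribution given as an array."""
    p = p[p > 0]
    return float(-np.sum(p * np.log2(p)))

def co_information(pxyz):
    """Triple mutual information I(X:Y:Z) by inclusion-exclusion:
    H(X)+H(Y)+H(Z) - H(XY)-H(XZ)-H(YZ) + H(XYZ)."""
    hx = H(pxyz.sum(axis=(1, 2)))
    hy = H(pxyz.sum(axis=(0, 2)))
    hz = H(pxyz.sum(axis=(0, 1)))
    hxy = H(pxyz.sum(axis=2))
    hxz = H(pxyz.sum(axis=1))
    hyz = H(pxyz.sum(axis=0))
    return hx + hy + hz - hxy - hxz - hyz + H(pxyz)

# Redundancy: X = Y = Z, a uniform bit copied three times -> +1 bit
copy = np.zeros((2, 2, 2))
copy[0, 0, 0] = copy[1, 1, 1] = 0.5

# Synergy: Z = X XOR Y with X, Y independent uniform bits -> -1 bit
xor = np.zeros((2, 2, 2))
for x in range(2):
    for y in range(2):
        xor[x, y, x ^ y] = 0.25
```

Since I(X:Y:Z) is negative in the XOR case, no nonnegative second-order term can sit below it, which motivates keeping the decomposition coarse rather than splitting union information further.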

Technical Definitions
Mutual Information as Motivation
Extension to Channels
Two Inputs
One Input
Three Inputs
EXAMPLES
Single Node Channel
Split Channel
Correlated Inputs
AND and OR
XorLoses
XorDuplicate
GENERAL CASE
COMPARISON WITH TWO RECENT APPROACHES
CONCLUSION