Abstract

Accurately determining dependency structure is critical to understanding a complex system’s organization. We recently showed that the transfer entropy fails in a key aspect of this—measuring information flow—due to its conflation of dyadic and polyadic relationships. We extend this observation to demonstrate that Shannon information measures (entropy and mutual information, in their conditional and multivariate forms) can fail to accurately ascertain multivariate dependencies due to their conflation of qualitatively different relations among variables. This has broad implications, particularly when employing information to express the organization and mechanisms embedded in complex systems, including the burgeoning efforts to combine complex network theory with information theory. Here, we do not suggest that any aspect of information theory is wrong. Rather, the vast majority of its informational measures are simply inadequate for determining the meaningful relationships among variables within joint probability distributions. We close by demonstrating that such distributions exist across an arbitrary set of variables.

Highlights

  • Shannon information measures (entropy and mutual information, in their conditional and multivariate forms) can conflate qualitatively different relations among variables

  • We consider only simple probability distributions, making no assumptions as to how they might arise from the dynamics of interacting agents

  • Refinement is not strictly needed: the partial information decomposition is able to discover a distribution’s internal structure without it

Summary

Introduction

Information theory is a general, broadly applicable framework for understanding a system’s statistical properties [1]. The inability of standard methods of decomposing the joint entropy to provide any semantic understanding of how information is shared has motivated entirely new methods of decomposing information [29,30]. Common to all of these is the fact that conditional mutual information conflates qualitatively different relations among variables. We demonstrate a related, but deeper issue: Shannon information measures (entropy and mutual information, together with their conditional and multivariate versions) can fail to distinguish joint distributions with vastly differing internal dependencies.

We then extend these concerns to an arbitrary number of variables in two ways. The first, dyadic camouflage, shows that such indistinguishable distributions exist over any number of variables; the second details a method of embedding an arbitrary distribution into a larger variable space using hierarchical dependencies, a technique we term “dependency diffusion”. In this way, one sees that these concerns about information measures can arise in virtually any multivariate statistical analysis. We assume a working knowledge of information theory, such as that found in standard textbooks [35,36,37,38].
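
To make the issue concrete, the following is a minimal Python sketch (an illustrative construction of our own; the variable names and the particular pair of distributions are assumptions, not code or data taken from the article). It builds one distribution over three variables from purely pairwise bit-sharing and another from a three-way XOR constraint plus a single shared bit, then checks a representative set of Shannon quantities (marginal, pairwise, and joint entropies, mutual information, and conditional mutual information) and finds them identical, even though the first involves only dyadic relations and the second a genuinely triadic one.

```python
# Illustrative sketch: two joint distributions over (X, Y, Z) whose
# Shannon measures coincide despite very different dependency structure.
from itertools import product
from math import log2


def dyadic():
    # Purely pairwise structure: each variable is a pair of bits, and
    # neighbouring variables share one bit (X~Y share b, Y~Z share c, Z~X share a).
    dist = {}
    for a, b, c in product((0, 1), repeat=3):
        outcome = ((a, b), (b, c), (c, a))
        dist[outcome] = dist.get(outcome, 0.0) + 1 / 8
    return dist


def triadic():
    # Genuinely three-way structure: a parity constraint a ^ b ^ d == 0
    # plus one bit s shared by all three variables.
    dist = {}
    for a, b, s in product((0, 1), repeat=3):
        d = a ^ b
        outcome = ((a, s), (b, s), (d, s))
        dist[outcome] = dist.get(outcome, 0.0) + 1 / 8
    return dist


def entropy(dist, idx):
    """Shannon entropy (in bits) of the marginal over the variables in idx."""
    marginal = {}
    for outcome, p in dist.items():
        key = tuple(outcome[i] for i in idx)
        marginal[key] = marginal.get(key, 0.0) + p
    return -sum(p * log2(p) for p in marginal.values() if p > 0)


def report(name, dist):
    hx, hy, hz = (entropy(dist, (i,)) for i in range(3))
    hxy, hxz, hyz = entropy(dist, (0, 1)), entropy(dist, (0, 2)), entropy(dist, (1, 2))
    hxyz = entropy(dist, (0, 1, 2))
    i_xy = hx + hy - hxy                # mutual information I(X;Y)
    i_xy_z = hxz + hyz - hz - hxyz      # conditional mutual information I(X;Y|Z)
    print(f"{name}: H(X)={hx:.1f}  H(X,Y)={hxy:.1f}  H(X,Y,Z)={hxyz:.1f}  "
          f"I(X;Y)={i_xy:.1f}  I(X;Y|Z)={i_xy_z:.1f}")


report("dyadic ", dyadic())
report("triadic", triadic())
# Both lines print identical values (2.0, 3.0, 3.0, 1.0, 1.0).
```

By the symmetry of both constructions, the remaining marginal and pairwise entropies agree as well, so no Shannon information measure separates the two distributions; telling them apart requires looking beyond Shannon information, which is the point developed in the remainder of the paper.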

Development
Discussion
Dyadic Camouflage and Dependency Diffusion
Conclusions