Abstract

We formalize the notion of the dependency structure of a collection of multiple signals, relevant from the perspective of information theory, artificial intelligence, neuroscience, complex systems and other related fields. We model multiple signals by commutative diagrams of probability spaces with measure-preserving maps between some of them. We introduce the asymptotic entropy (pseudo-)distance between diagrams, expressing how much two diagrams differ from an information-processing perspective. If the distance vanishes, we say that two diagrams are asymptotically equivalent. In this context, we prove an asymptotic equipartition property: any sequence of tensor powers of a diagram is asymptotically equivalent to a sequence of homogeneous diagrams. This sequence of homogeneous diagrams expresses the relevant dependency structure.
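The abstract does not spell out the formulas; as a sketch, the central quantities can be written following the standard coupling-based construction (treat the exact conventions as an assumption here). For probability spaces X and Y, the intrinsic entropy distance is an infimum over two-fans X ← Z → Y, i.e. joint spaces Z with measure-preserving maps onto X and Y, and the asymptotic entropy distance normalizes over tensor powers:

```latex
\mathbf{k}(X,Y) \;=\; \inf_{X \leftarrow Z \rightarrow Y}
  \bigl[\,(H(Z)-H(X)) + (H(Z)-H(Y))\,\bigr],
\qquad
\boldsymbol{\kappa}(X,Y) \;=\; \lim_{n\to\infty}\,\frac{1}{n}\,
  \mathbf{k}\!\left(X^{\otimes n},\, Y^{\otimes n}\right).
```

Two sequences of diagrams are then asymptotically equivalent when this asymptotic distance vanishes.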

Highlights

  • According to usual modeling assumptions in information theory, a discrete signal is cut into a collection of long words of length n, whose particular representation is irrelevant, and small errors are allowed

  • We formalize the notion of the dependency structure of a collection of multiple signals, relevant from the perspective of information theory, artificial intelligence, neuroscience, complex systems and other related fields

  • We prove an asymptotic equipartition property: any sequence of tensor powers of a diagram is asymptotically equivalent to a sequence of homogeneous diagrams


Introduction

Information Geometry (2018) 1:237–285

According to usual modeling assumptions in information theory, a discrete signal is cut into a collection of long words of length n, whose particular representation is irrelevant (each word is considered as an atomic object without inner structure), and small errors are allowed. Under these assumptions, essentially one relevant invariant of the signal remains, namely the entropy: the exponential growth rate of the number of typical words of length n. We elaborate on this point of view below. If one probes a measure-preserving dynamical system at a discrete sequence of times with a finite-output measurement device and counts measurement trajectories of length n, while discarding rarely appearing, untypical ones, one arrives at the notion of entropy of a system–measurement pair. Entropy, in this case, is the exponential growth rate of the number of typical trajectories with respect to the length n. In this article we characterize, under these modeling assumptions, the relevant invariants of multiple signals that are obtained as i.i.d. samples from random variables. We first explain our point of view on entropy for a single signal, that is, for a single probability space.
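The claim that entropy is the exponential growth rate of the number of typical words can be checked numerically. The sketch below (an illustration, not code from the paper) counts the length-n binary words of a Bernoulli(p) source that fall in the standard eps-typical set and compares the growth rate of that count to the binary entropy H(p):

```python
import math

def entropy(p):
    """Binary entropy H(p) in bits."""
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def count_typical(p, n, eps=0.1):
    """Count length-n binary words w with |-(1/n) log2 P(w) - H(p)| <= eps.

    A word with k ones has probability p**k * (1-p)**(n-k), so words
    group by their number of ones and can be counted with binomials.
    """
    h = entropy(p)
    count = 0
    for k in range(n + 1):
        logp = k * math.log2(p) + (n - k) * math.log2(1 - p)
        if abs(-logp / n - h) <= eps:
            count += math.comb(n, k)
    return count

p, n = 0.3, 200
rate = math.log2(count_typical(p, n)) / n
# rate, the exponential growth exponent of the typical set,
# approaches H(0.3) ~ 0.881 bits as n grows
```

For n = 200 the measured rate already sits within the eps-band around H(p), which is exactly the single-space version of the asymptotic equipartition property generalized to diagrams in this paper.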

Probability spaces and their entropy
Asymptotic equivalence
Diagrams of probability spaces
The entropy distances and asymptotic equivalence for diagrams
Asymptotic equipartition property
Definitions and results in random variable context
Category of probability spaces and diagrams
Categories
Singleton
Two-fan
A diamond diagram
Full diagram
Constant diagrams
Homogeneous diagrams
Universal construction of homogeneous diagrams
Conditioning
Entropy
The entropy distance
Entropy distance in the case of single probability spaces
Entropy distance for complete diagrams
The asymptotic entropy distance
Tensor product
The Slicing Lemma
Distributions and types
Single probability spaces
Distributions on diagrams
Types for single probability spaces
Types for complete diagrams
The empirical two-fan
Distance between types
Asymptotic equipartition property for diagrams
Technical proofs
