Abstract

The notion of Bregman divergence and sufficiency will be defined on general convex state spaces. It is demonstrated that only spectral sets can have a Bregman divergence that satisfies a sufficiency condition. Positive elements with trace 1 in a Jordan algebra are examples of spectral sets, and the most important example is the set of density matrices with complex entries. It is conjectured that information theoretic considerations lead directly to the notion of Jordan algebra under some regularity conditions.
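As a concrete, informal illustration of the objects mentioned above (not part of the paper's formal development), the following Python sketch builds two density matrices, checks that the spectrum of a density matrix is a probability vector, and evaluates the Bregman divergence generated by the negative von Neumann entropy, which for trace-1 states reduces to the quantum relative entropy Tr ρ(ln ρ − ln σ); the specific matrices are illustrative assumptions.

```python
import numpy as np
from scipy.linalg import logm

def bregman_divergence(f, grad_f, x, y):
    """Bregman divergence D_f(x, y) = f(x) - f(y) - <grad f(y), x - y>."""
    return f(x) - f(y) - np.trace(grad_f(y) @ (x - y)).real

# Convex potential: negative von Neumann entropy f(rho) = Tr(rho ln rho).
f = lambda rho: np.trace(rho @ logm(rho)).real
grad_f = lambda rho: logm(rho) + np.eye(rho.shape[0])   # gradient of f

# Two illustrative density matrices: positive with trace 1.
rho = np.array([[0.7, 0.2], [0.2, 0.3]], dtype=complex)
sigma = np.array([[0.5, 0.1j], [-0.1j, 0.5]], dtype=complex)

# The spectrum of a density matrix is a probability vector.
print(np.linalg.eigvalsh(rho))                            # nonnegative, sums to 1
print(bregman_divergence(f, grad_f, rho, sigma))          # Bregman divergence of -entropy
print(np.trace(rho @ (logm(rho) - logm(sigma))).real)     # quantum relative entropy, same value
```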

Highlights

  • One of the main purposes of information theory is to compress data so that data can be recovered exactly or approximately

  • As we have seen, monotonicity implies that the Bregman divergence must be proportional to information divergence, which is jointly convex in both arguments

  • We see that, in general, joint convexity is not a sufficient condition for monotonicity; in the case where the state space has only two orthogonal states, it is not known whether joint convexity of a Bregman divergence is sufficient to conclude that the Bregman divergence is monotone (see the numerical sketch after this list)
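
The two claims above can be probed numerically. The following sketch uses arbitrarily chosen probability vectors and a randomly drawn stochastic matrix (both assumptions made only for illustration) to check that classical information divergence is jointly convex and monotone under a stochastic map, i.e. satisfies the data processing inequality.

```python
import numpy as np

def kl(p, q):
    """Information divergence D(P||Q) = sum_i p_i ln(p_i / q_i)."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    return float(np.sum(p * np.log(p / q)))

rng = np.random.default_rng(0)
p1, p2, q1, q2 = [rng.dirichlet(np.ones(4)) for _ in range(4)]

# Joint convexity:
# D(t*P1+(1-t)*P2 || t*Q1+(1-t)*Q2) <= t*D(P1||Q1) + (1-t)*D(P2||Q2)
t = 0.3
lhs = kl(t * p1 + (1 - t) * p2, t * q1 + (1 - t) * q2)
rhs = t * kl(p1, q1) + (1 - t) * kl(p2, q2)
print(lhs, "<=", rhs, lhs <= rhs + 1e-12)

# Monotonicity (data processing): D(T P || T Q) <= D(P || Q)
# for a column-stochastic map T acting on probability vectors.
T = rng.dirichlet(np.ones(3), size=4).T   # 3x4, columns sum to 1
print(kl(T @ p1, T @ q1), "<=", kl(p1, q1))
```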


Summary

Introduction

One of the main purposes of information theory is to compress data so that the data can be recovered exactly or approximately. One of the most important quantities is called entropy, because it is calculated according to a formula that mimics the calculation of entropy in statistical mechanics. Another key concept in information theory is information divergence (KL-divergence), which is defined for probability vectors P and Q as $D(P\|Q) = \sum_i p_i \ln(p_i/q_i)$. First we introduce some general results about optimization on state spaces of finite dimensional C*-algebras. This part applies to all the topics under consideration and leads to Bregman divergences or, more generally, regret functions. We introduce several notions of sufficiency and show that sufficiency leads to information divergence. In a number of cases it is not possible or not relevant to impose the condition of sufficiency, which can explain why regret functions are not always equal to information divergence.
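
As a hedged illustration of how a convex potential on the state space generates a regret/Bregman divergence, and of why the negative Shannon entropy recovers the information divergence defined above, consider the following sketch; the potential and the probability vectors are chosen only for illustration.

```python
import numpy as np

def bregman(F, gradF, p, q):
    """Bregman divergence D_F(p, q) = F(p) - F(q) - <grad F(q), p - q>."""
    return F(p) - F(q) - np.dot(gradF(q), p - q)

# Convex potential: negative Shannon entropy F(p) = sum_i p_i ln p_i.
F = lambda p: np.sum(p * np.log(p))
gradF = lambda p: np.log(p) + 1.0

p = np.array([0.6, 0.3, 0.1])
q = np.array([0.2, 0.5, 0.3])

print(bregman(F, gradF, p, q))      # Bregman divergence of -entropy
print(np.sum(p * np.log(p / q)))    # equals D(P||Q) on the probability simplex
```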

Structure of the State Space
Optimization
Information Theory
Scoring Rules
Statistical Mechanics
Portfolio Theory
Sufficiency Conditions
Entropy and Information Divergence
Monotonicity
Sufficiency
Locality
Statistics
Monotone
Concluding Remarks