Abstract

We investigate the concept of entropy in probabilistic theories more general than quantum mechanics, with particular reference to the notion of information causality (IC) recently proposed by Pawlowski et al (2009, arXiv:0905.2292). We consider two entropic quantities, which we term measurement and mixing entropy. In the context of classical and quantum theory, these coincide, being given by the Shannon and von Neumann entropies, respectively; in general, however, they are very different. In particular, while measurement entropy is easily seen to be concave, mixing entropy need not be. In fact, as we show, mixing entropy is not concave whenever the state space is a non-simplicial polytope. Thus, the condition that measurement and mixing entropies coincide is a strong constraint on possible theories. We call theories with this property monoentropic. Measurement entropy is subadditive, but not in general strongly subadditive. Equivalently, if we define the mutual information between two systems A and B by the usual formula I(A:B) = H(A) + H(B) - H(AB), where H denotes the measurement entropy and AB is a non-signaling composite of A and B, then it can happen that I(A:BC) < I(A:B). This is relevant to IC in the sense of Pawlowski et al: we show that any monoentropic non-signaling theory in which measurement entropy is strongly subadditive, and which also satisfies a version of the Holevo bound, is informationally causal; on the other hand, we observe that Popescu–Rohrlich boxes, which violate IC, also violate strong subadditivity. We also explore the interplay between measurement and mixing entropy and various natural conditions on theories that arise in quantum axiomatics.
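To make the mutual-information formula and the role of Popescu–Rohrlich boxes concrete, here is a minimal Python sketch (our illustration, not code from the paper). It evaluates the measurement entropies of a PR box, restricting the minimization for the composite to the four product tests; the helper names `shannon` and `pr_box` are ours.

```python
import numpy as np

def shannon(p):
    """Shannon entropy (bits) of a probability vector, ignoring zeros."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def pr_box(i, j):
    """Joint outcome distribution over (a, b) for the product test (i, j):
    a XOR b = i AND j with probability 1, and uniform marginals."""
    P = np.zeros((2, 2))
    for a in (0, 1):
        for b in (0, 1):
            if a ^ b == i & j:
                P[a, b] = 0.5
    return P

# Measurement entropy: minimum outcome entropy over the available tests
# (here, for the composite AB, over the four product tests only).
H_AB = min(shannon(pr_box(i, j).ravel()) for i in (0, 1) for j in (0, 1))
H_A = min(shannon(pr_box(i, 0).sum(axis=1)) for i in (0, 1))  # A's marginal
H_B = min(shannon(pr_box(0, j).sum(axis=0)) for j in (0, 1))  # B's marginal

print(H_A, H_B, H_AB)    # -> 1.0 1.0 1.0
print(H_A + H_B - H_AB)  # -> 1.0, i.e. I(A:B) = 1 bit
```

Under these assumptions the printout gives H(A) = H(B) = H(AB) = 1 bit, so I(A:B) = 1 bit: the box exhibits a full bit of measured mutual information while permitting no signaling, and, as noted above, such boxes also violate strong subadditivity.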

Highlights

  • We investigate the concept of entropy in probabilistic theories more general than quantum mechanics, with particular reference to the notion of information causality (IC) recently proposed by Pawlowski et al (2009, arXiv:0905.2292)

  • It is established in [27] that quantum mechanics—and classical probability theory—satisfies this IC constraint

  • In order to address this question, we develop some of the basic machinery of entropy, conditional entropy and mutual information in a very general probabilistic setting—an independently interesting problem, which seems not to have received much previous attention

Summary

General probabilistic models

There is a more-or-less standard mathematical framework for discussing general probabilistic models, going back at least to the work of Mackey in the 1950s, and further developed (or, in some cases, rediscovered) in succeeding decades by various authors [1, 13, 15, 16, 18, 25]. We shall be interested exclusively in discrete, finite-dimensional systems. From this point forward, we make the standing assumptions that (i) A is locally finite, meaning that all tests E ∈ A are finite sets, and (ii) the state space Ω is finite-dimensional and closed. Local finiteness guarantees that the maximal state space Ω(A) is compact; the closedness of the physical state space Ω ensures that it, too, is compact. It follows that every state can be represented as a finite convex combination, or mixture, of pure states, that is, extreme points of Ω. A simple example that is neither classical nor quantum, and to which we shall refer often, is the ‘two-bit’ test space A2 = {{a, a′}, {b, b′}}, consisting of a pair of two-outcome tests, depicted in figure 1.
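For concreteness, here is a small Python sketch of the two-bit system A2 (our illustration, with hypothetical helper names), assuming the usual square state space: a state is a pair (x, y), with x the probability of outcome a on the first test and y the probability of b on the second, and the pure states are the four vertices of the square. The sketch computes the measurement entropy (the minimum of the two binary outcome entropies) and the mixing entropy (minimized over convex decompositions into the vertices), and exhibits one way the concavity failure claimed in the abstract can show up on this state space.

```python
import numpy as np

def shannon(w):
    """Shannon entropy (bits) of a weight vector, ignoring zeros."""
    w = np.asarray(w, dtype=float)
    w = w[w > 1e-12]
    return float(-(w * np.log2(w)).sum())

# A state of A2 is (x, y): x = Pr(a) on the test {a, a'},
# y = Pr(b) on the test {b, b'}.  Pure states: the square's vertices.

def measurement_entropy(x, y):
    """Minimum outcome entropy over the two tests; a minimum of
    concave functions, hence concave."""
    return min(shannon([x, 1 - x]), shannon([y, 1 - y]))

def mixing_entropy(x, y):
    """Minimum Shannon entropy over convex decompositions of (x, y)
    into the vertices v1=(0,0), v2=(1,0), v3=(1,1), v4=(0,1).  The
    admissible weights (1-x-y+t, x-t, t, y-t) form a line segment as
    t varies; entropy is concave in t, so the minimum sits at an
    endpoint of the segment."""
    best = np.inf
    for t in (max(0.0, x + y - 1.0), min(x, y)):
        best = min(best, shannon([1 - x - y + t, x - t, t, y - t]))
    return best

# Non-concavity of mixing entropy on the square: two edge midpoints,
# each with mixing entropy 1 bit, mix to a state with less than 1 bit.
s1, s2 = (0.5, 0.0), (0.0, 0.5)
mid = (0.25, 0.25)                               # equal mixture of s1, s2
print(mixing_entropy(*s1), mixing_entropy(*s2))  # -> 1.0 1.0
print(mixing_entropy(*mid))                      # -> ~0.811 < 1
print(measurement_entropy(*mid))                 # concave, no such failure
```

Concavity would require the mixture's mixing entropy to be at least the average of the endpoints', i.e. at least 1 bit; here it is only about 0.811 bits, since (1/4, 1/4) decomposes as (3/4)v1 + (1/4)v3. Measurement entropy, being a minimum of concave functions, shows no such failure.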

Measurement and mixing entropies
Composite systems and joint entropy
Composite systems
Data processing and the Holevo bound
Information causality
Theories satisfying IC
