The Information Causality principle was proposed to re-derive the Tsirelson bound, an upper limit on the strength of quantum correlations, and has been suggested as a candidate law of nature. The principle states that the Shannon information about Alice's distant database gained by Bob after receiving an m-bit message cannot exceed m bits, even when Alice and Bob share non-local resources. As originally formulated, the principle can be shown to be violated exactly when the strength of the shared correlations exceeds the Tsirelson bound. However, we demonstrate here that when an alternative measure of information, one of the Rényi measures, is chosen, the Information Causality principle no longer yields the correct value for the Tsirelson bound. We argue that neither the assumption of particular 'intuitive' properties of uncertainty measures, nor pragmatic choices about how to optimise costs associated with communication, is sufficient to uniquely motivate the choice of the Shannon measure from amongst the more general Rényi measures. We conclude that the dependence of the success of Information Causality on mere convention undermines its claimed significance as a foundational principle.
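For orientation, the quantities at issue can be stated compactly (standard formulations from the Information Causality literature; the notation $n$, $m$, $a_k$, $\beta$ is illustrative, and the paper's precise definitions may differ). With Alice holding database bits $a_1, \dots, a_n$ and sending Bob an $m$-bit message $\beta$, Information Causality demands

$$ I \;\equiv\; \sum_{k=1}^{n} I(a_k : \beta) \;\le\; m, $$

where $I(\cdot : \cdot)$ is the Shannon mutual information. The alternative measures considered are the Rényi entropies

$$ H_\alpha(X) \;=\; \frac{1}{1-\alpha}\,\log_2 \sum_{x} p(x)^{\alpha}, \qquad \alpha \ge 0,\ \alpha \neq 1, $$

which recover the Shannon entropy in the limit $\alpha \to 1$; the Tsirelson bound itself is the quantum-mechanical ceiling $|\langle \mathrm{CHSH} \rangle| \le 2\sqrt{2}$ on the CHSH correlation value.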