Abstract

It is by now a well-known fact that the correlations arising from local dichotomic measurements on an entangled quantum state may exhibit intrinsically non-classical features. In this paper we undertake a comprehensive study of random instances of such bipartite correlations. The main question we are interested in is: given a quantum correlation, sampled at random, how likely is it that it cannot be explained by any classical model? We show that, under very general assumptions on the sampling distribution, a random correlation lying on the border of the quantum set is, with high probability, outside the classical set. What is more, we are able to provide the Bell inequality certifying this fact. On the technical side, our results follow from (i) estimating precisely the "quantum norm" of a random matrix, and (ii) lower bounding sharply enough its "classical norm", hence proving a gap between the two. Along the way, we need a non-trivial upper bound on the $\infty{\rightarrow}1$ norm of a random orthogonal matrix, which might be of independent interest.
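
To make the two norms at play concrete, the following is an illustrative sketch (not taken from the paper): for a correlation Bell matrix $M$, the classical value is its $\infty{\rightarrow}1$ norm, a maximum over sign assignments, while the quantum (Tsirelson) value is a maximum over unit-vector assignments. The code below brute-forces the former and uses a simple see-saw heuristic for the latter on the CHSH matrix; the function names and the see-saw approach are illustrative choices, not the paper's method.

```python
# Illustrative sketch: classical (inf->1) vs. quantum value of a correlation matrix.
# Assumes the standard definitions; not code from the paper.
import itertools
import numpy as np

def classical_norm(M):
    """||M||_{inf->1} = max over sign vectors eps, eta of eps^T M eta."""
    n, _ = M.shape
    best = -np.inf
    for eps in itertools.product([-1, 1], repeat=n):
        # For fixed eps, the optimal eta picks the sign of each entry of eps^T M.
        best = max(best, np.abs(np.asarray(eps) @ M).sum())
    return best

def quantum_norm(M, dim=None, iters=200, seed=0):
    """See-saw lower bound on max over unit vectors u_i, v_j of sum_ij M_ij u_i.v_j."""
    n, m = M.shape
    dim = dim or n + m
    rng = np.random.default_rng(seed)
    V = rng.normal(size=(m, dim))
    V /= np.linalg.norm(V, axis=1, keepdims=True)
    for _ in range(iters):
        U = M @ V                      # optimal u_i is proportional to sum_j M_ij v_j
        U /= np.linalg.norm(U, axis=1, keepdims=True)
        V = M.T @ U                    # optimal v_j is proportional to sum_i M_ij u_i
        V /= np.linalg.norm(V, axis=1, keepdims=True)
    return float(np.sum(M * (U @ V.T)))

chsh = np.array([[1.0, 1.0], [1.0, -1.0]])
print(classical_norm(chsh))   # 2.0
print(quantum_norm(chsh))     # ~2.828, i.e. 2*sqrt(2)
```

For the CHSH matrix the gap between the two values (2 versus $2\sqrt{2}$) is exactly the kind of classical/quantum separation whose generic occurrence the paper quantifies for random correlations.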
