Abstract

In this paper, we present a general framework for solving a fundamental problem in random matrix theory (RMT): describing the joint distribution of eigenvalues of the sum of two independent random Hermitian matrices. Several questions about the mixing of quantum states are essentially subsumed into this mathematical problem. Rather than treating the general problem, we focus on deriving the spectral density of a mixture of adjoint orbits of quantum states in terms of the Duistermaat–Heckman measure, which originates in symplectic geometry. With this method we obtain the spectral density of a mixture of independent random states; in particular, we derive explicit formulas for mixtures of random qubits. We also find that, in a two-level quantum system, the average entropy of the equiprobable mixture of n random density matrices chosen from a random state ensemble (specified in the text) increases with the number n. Hence, as a physical application, our results quantitatively explain that the quantum coherence of the mixture decreases monotonically, in a statistical sense, as the number of components n in the mixture grows. In addition, our method may be used to investigate statistical properties of a special subclass of unital qubit channels.

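As a quick numerical illustration of the monotonicity claim above, the following Python sketch estimates by Monte Carlo the average von Neumann entropy of the equiprobable mixture of n independent random qubit states. The choice of the Hilbert–Schmidt (Ginibre-induced) ensemble and the helper names `random_qubit_state` and `average_mixture_entropy` are assumptions made here for illustration; the ensemble used in the paper is specified in the text, so this is a qualitative check rather than a reproduction of the paper's result.

```python
# Monte Carlo sketch: average von Neumann entropy of the equiprobable mixture
# rho_bar = (1/n) * sum_i rho_i of n independent random qubit density matrices.
# Assumption: each rho_i is sampled from the Hilbert-Schmidt ensemble
# (rho = G G^dagger / Tr(G G^dagger), G a 2x2 complex Ginibre matrix).

import numpy as np

rng = np.random.default_rng(0)

def random_qubit_state() -> np.ndarray:
    """Sample a 2x2 density matrix from the Hilbert-Schmidt ensemble."""
    g = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

def von_neumann_entropy(rho: np.ndarray) -> float:
    """S(rho) = -Tr(rho log rho), computed from the eigenvalues."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]  # drop numerical zeros
    return float(-(evals * np.log(evals)).sum())

def average_mixture_entropy(n: int, samples: int = 20000) -> float:
    """Average entropy of the equiprobable mixture of n i.i.d. random qubits."""
    total = 0.0
    for _ in range(samples):
        mix = sum(random_qubit_state() for _ in range(n)) / n
        total += von_neumann_entropy(mix)
    return total / samples

if __name__ == "__main__":
    # The averages should increase toward log 2 as n grows, consistent with
    # the statistical decrease of quantum coherence described in the abstract.
    for n in (1, 2, 3, 5, 10):
        print(f"n = {n:2d}:  <S> ~ {average_mixture_entropy(n):.4f}")
```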