Abstract
We train convolutional neural networks to predict whether or not a given set of measurements is informationally complete, that is, sufficient to uniquely reconstruct any quantum state without prior information. In addition, we perform fidelity benchmarking based on this measurement set without explicitly carrying out state tomography. The networks are trained to recognize the fidelity and a reliable measure of informational completeness. By gradually accumulating measurements and data, these trained convolutional networks efficiently establish a compressive quantum-state characterization scheme that accelerates runtime computation and greatly reduces systematic drifts in experiments. We confirm the potential of this machine-learning approach by presenting experimental results for both spatial-mode and multiphoton systems of large dimensions. These predictions are further shown to improve when the networks are trained with additional bootstrapped training sets drawn from real experimental data. Using a realistic beam-profile displacement error model for Hermite–Gaussian sources, we further demonstrate numerically that the orders-of-magnitude reduction in certification time achieved with trained networks greatly increases the computation yield of a large-scale quantum processor using these sources, before the state fidelity deteriorates significantly.
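As a rough illustration of the kind of network involved, the sketch below shows a small convolutional regressor that maps an array of accumulated measurement data to a single scalar in [0, 1], standing in for the completeness measure (ICCNet) or the fidelity (FidNet). The input encoding, layer sizes, and training step are illustrative assumptions and do not reproduce the architectures used in this work.

```python
# Minimal illustrative sketch (not the paper's architecture): a small CNN that
# takes a 2D array encoding accumulated measurement outcomes for a
# d-dimensional system and regresses a single scalar in [0, 1], e.g. a
# completeness measure for ICCNet or a fidelity estimate for FidNet.
# Input shape, channel counts, and loss are assumptions made for this sketch.
import torch
import torch.nn as nn

class ScalarCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(4),          # pool to a fixed 4x4 grid
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 4 * 4, 64), nn.ReLU(),
            nn.Linear(64, 1), nn.Sigmoid(),   # scalar output in [0, 1]
        )

    def forward(self, x):                     # x: (batch, 1, d, n_bases)
        return self.head(self.features(x)).squeeze(-1)

# One training step on simulated (data, target) pairs:
model = ScalarCNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
x = torch.rand(8, 1, 16, 20)                  # toy batch of measurement data
y = torch.rand(8)                             # toy regression targets
opt.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
opt.step()
```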
Highlights
Recent advances in quantum algorithms and error correction [1,2,3,4,5,6] have fueled the development of noisy intermediate-scale quantum computing devices.
Proper data analysis first entails the extraction of physical probabilities from the accumulated data, which can be done with well-established statistical methods, such as those of maximum likelihood (ML) [23, 24, 73, 74, 75] and least squares (LS) [76, 77], subject to the physical constraints on density matrices.
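As a minimal illustration of the least-squares step (not the ML/LS estimators cited above), the following sketch inverts the Born rule by ordinary least squares and then crudely projects the result back onto the set of density matrices by clipping negative eigenvalues and renormalizing; the POVM and data are toy placeholders.

```python
# Sketch of an unconstrained least-squares inversion followed by a crude
# projection onto the density-matrix set (eigenvalue clipping and
# renormalization). Illustrative only; not the estimators referenced above.
import numpy as np

def least_squares_estimate(povm, freqs):
    """povm: (K, d, d) array of POVM elements; freqs: (K,) relative frequencies."""
    K, d, _ = povm.shape
    freqs = np.asarray(freqs, dtype=complex)
    # Born rule p_k = Tr(rho Pi_k) written as a linear system A @ vec(rho) = p.
    A = povm.reshape(K, d * d).conj()
    rho_vec, *_ = np.linalg.lstsq(A, freqs, rcond=None)
    rho = rho_vec.reshape(d, d)
    rho = (rho + rho.conj().T) / 2                 # enforce Hermiticity
    evals, evecs = np.linalg.eigh(rho)
    evals = np.clip(evals, 0, None)                # clip negative eigenvalues
    evals /= evals.sum()                           # renormalize to unit trace
    return (evecs * evals) @ evecs.conj().T

# Toy usage: a single qubit measured only in the computational basis.
povm = np.array([np.diag([1.0, 0.0]), np.diag([0.0, 1.0])], dtype=complex)
rho_hat = least_squares_estimate(povm, np.array([0.7, 0.3]))
```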
Simulations (neural-network performances): We first present performance graphs of the informational completeness certification net (ICCNet) and the fidelity prediction net (FidNet) in figure 7, based on two sets of simulations on four-qubit states (d = 16), using random measurement bases generated with the Haar measure for the unitary group and bases found using adaptive compressive tomography (ACT).
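For reference, one standard way to sample the Haar-random measurement bases used in such simulations is to draw a Haar-random unitary and take its columns as the basis vectors; the sketch below uses scipy's unitary_group sampler and is only meant to illustrate this step.

```python
# Sketch: sampling a random measurement basis from the Haar measure on U(d).
# Each column of a Haar-random unitary gives one basis vector; the
# corresponding rank-1 projectors form one measurement basis for the
# four-qubit case d = 16 mentioned in the text.
import numpy as np
from scipy.stats import unitary_group

def random_basis_projectors(d, seed=None):
    U = unitary_group.rvs(d, random_state=seed)       # Haar-random unitary
    return [np.outer(U[:, k], U[:, k].conj()) for k in range(d)]

projectors = random_basis_projectors(16, seed=0)
assert np.allclose(sum(projectors), np.eye(16))       # basis resolves the identity
```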
Summary
Recent advances in quantum algorithms and error correction [1,2,3,4,5,6] have fueled the development of noisy intermediate-scale quantum computing devices. A crucial ingredient in these methods is informational completeness certification (ICC), which determines whether or not a given measurement set and its corresponding data are informationally complete (IC). This is done by computing a uniqueness measure s_CVX from the given measurements. When s_CVX > 0, there is equivalently a convex set of state estimators that are consistent with the physical probabilities. It can be shown [52] that a unique estimator is obtained from the measured POVM and corresponding data if and only if s_CVX = 0.
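One common way to evaluate such a uniqueness measure numerically is to solve two semidefinite programs that maximize and minimize a linear functional Tr(ρZ) over all density matrices consistent with the measured probabilities; the gap vanishes for every Hermitian Z exactly when the consistent state is unique, so a generic Z serves as a practical witness. The sketch below, written with cvxpy and an arbitrarily chosen Z, illustrates this idea and is not necessarily the exact construction of reference [52].

```python
# Sketch: a convex-programming test of informational completeness. Over all
# density matrices rho consistent with measured probabilities p_k for POVM
# elements Pi_k, maximize and minimize Tr(rho Z) for a fixed Hermitian Z;
# the gap is zero for every Z exactly when the consistent state is unique.
# The particular Z used here is an assumption for illustration.
import numpy as np
import cvxpy as cp

def s_cvx(povm, probs, Z):
    d = povm[0].shape[0]
    rho = cp.Variable((d, d), hermitian=True)
    constraints = [rho >> 0, cp.real(cp.trace(rho)) == 1]
    constraints += [cp.real(cp.trace(rho @ Pi)) == p for Pi, p in zip(povm, probs)]
    objective = cp.real(cp.trace(rho @ Z))
    f_max = cp.Problem(cp.Maximize(objective), constraints).solve()
    f_min = cp.Problem(cp.Minimize(objective), constraints).solve()
    return f_max - f_min

# Toy usage: a single qubit measured only in the computational basis is not
# informationally complete, so the gap is strictly positive for a witness
# sensitive to coherences.
povm = [np.diag([1.0, 0.0]).astype(complex), np.diag([0.0, 1.0]).astype(complex)]
Z = np.array([[0, 1], [1, 0]], dtype=complex)          # Pauli X as witness
print(s_cvx(povm, [0.7, 0.3], Z))                       # strictly positive
```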