Abstract

Gaussian boson sampling is one of the most promising approaches to demonstrating quantum computational advantage, and it also offers potential applications such as dense subgraph identification and quantum chemistry. However, the noise that is inevitable in experiments may weaken the quantum advantage of Gaussian boson sampling. Photon loss and partial photon indistinguishability are two major noise sources, and their influence on the complexity of Gaussian boson sampling has been extensively studied in previous work. However, the phase noise of the input light source, a noise source particularly relevant to Gaussian boson sampling, has not been studied so far. Here, we investigate the phase noise of the input light source in Gaussian boson sampling through numerical simulation, using a Monte Carlo method to approximate the output probability distribution under phase noise. We find that phase noise in the light source turns the input state from a Gaussian state into a non-Gaussian mixed state. For a given phase noise level, the fidelity between this non-Gaussian mixed state and the noise-free ideal state decreases monotonically as the mean input photon number increases. Meanwhile, the deviation of the output probability distribution caused by phase noise grows as the number of detected photons increases. Furthermore, phase noise significantly reduces the capability of heavy output generation (HOG). Finally, we find that Gaussian boson sampling with photon loss is more tolerant to phase noise than the lossless case at the same mean input photon number. Our study is helpful for suppressing phase noise in large-scale Gaussian boson sampling experiments.
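The Monte Carlo phase-averaging idea, and the claimed monotone fidelity decrease with mean photon number, can be illustrated for a single squeezed-vacuum mode. The sketch below is an assumption-laden toy model, not the paper's actual simulation: the phase fluctuation is taken to be Gaussian with standard deviation `sigma`, and all function names are illustrative. A random phase shift maps the Fock amplitudes c_n to e^{inθ}c_n, so averaging the projector over θ yields a non-Gaussian mixed state whose fidelity with the ideal state is E_θ|Σ_n p_n e^{inθ}|², with p_n = |c_n|².

```python
import math
import numpy as np

def fock_probs_squeezed_vacuum(r, cutoff=120):
    """Photon-number probabilities p_n of a single-mode squeezed vacuum
    with squeezing parameter r (only even n are populated)."""
    p = np.zeros(cutoff)
    for k in range(cutoff // 2):
        # log|c_{2k}| = (1/2)log((2k)!) - k*log 2 - log(k!)
        #               + k*log(tanh r) - (1/2)log(cosh r)
        log_amp = (0.5 * math.lgamma(2 * k + 1) - k * math.log(2.0)
                   - math.lgamma(k + 1) + k * math.log(math.tanh(r))
                   - 0.5 * math.log(math.cosh(r)))
        p[2 * k] = math.exp(2 * log_amp)
    return p / p.sum()  # renormalize away the Fock-space truncation error

def phase_noise_fidelity(r, sigma, n_samples=20000, seed=0):
    """Monte Carlo estimate of the fidelity between the phase-averaged
    (non-Gaussian mixed) state and the ideal squeezed vacuum:
    F = E_theta |<psi| psi_theta>|^2 with theta ~ N(0, sigma^2)."""
    rng = np.random.default_rng(seed)
    p = fock_probs_squeezed_vacuum(r)
    n = np.arange(len(p))
    thetas = rng.normal(0.0, sigma, n_samples)
    # Overlap <psi|psi_theta> = sum_n p_n e^{i n theta}, one row per sample.
    overlaps = np.exp(1j * np.outer(thetas, n)) @ p
    return float(np.mean(np.abs(overlaps) ** 2))

if __name__ == "__main__":
    # Larger r means larger mean photon number sinh(r)^2; at fixed sigma
    # the fidelity should decrease monotonically, as the abstract states.
    for r in (0.5, 1.0, 1.5):
        print(f"r={r}: n_mean={math.sinh(r)**2:.2f}, "
              f"F={phase_noise_fidelity(r, sigma=0.3):.4f}")
```

For this Gaussian phase model the average can also be done in closed form, E|Σ p_n e^{inθ}|² = Σ_{n,m} p_n p_m e^{-σ²(n-m)²/2}, which gives a convenient cross-check on the Monte Carlo estimate.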
