Abstract

The origin of errors in quantum computation implemented with linear optics is studied. Systematic errors in the quantum gates, phase relaxation, amplitude damping, and readout errors are considered as error sources, and the errors observed in a four-bit Deutsch-Jozsa algorithm experiment are categorized according to these sources. The increase in the error rate with the input size of the Deutsch-Jozsa algorithm is also studied, and it is found that a demonstration of 11 qubits using linear optics and a single photon is achievable with an error rate below 20% using the techniques of the experiment.
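As context for the error sources named above, phase relaxation and amplitude damping are commonly modeled as single-qubit channels in the operator-sum (Kraus) form. The following is a minimal sketch using the standard textbook definitions; the symbols $\gamma$ (decay probability) and $\lambda$ (dephasing probability) are generic parameters assumed here for illustration, not values taken from the experiment.

% Operator-sum form of a single-qubit channel (standard definition):
\mathcal{E}(\rho) = E_0 \rho E_0^{\dagger} + E_1 \rho E_1^{\dagger}

% Amplitude damping (energy loss with probability \gamma):
E_0 = \begin{pmatrix} 1 & 0 \\ 0 & \sqrt{1-\gamma} \end{pmatrix}, \qquad
E_1 = \begin{pmatrix} 0 & \sqrt{\gamma} \\ 0 & 0 \end{pmatrix}

% Phase relaxation (phase damping with probability \lambda):
E_0 = \begin{pmatrix} 1 & 0 \\ 0 & \sqrt{1-\lambda} \end{pmatrix}, \qquad
E_1 = \begin{pmatrix} 0 & 0 \\ 0 & \sqrt{\lambda} \end{pmatrix}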
