Abstract

Information processing based on the laws of quantum mechanics promises to be a revolutionary new avenue in information technology. This emerging field of quantum information processing (QIP) is, however, challenged by the fragile nature of the quantum bits (qubits) in which quantum information is stored and processed. An error in even a single qubit throws the quantum processor off track, corrupting the calculation as a whole, so the chance of an erroneous outcome increases with the number of qubits in the processor. Large-scale QIP thus hinges on the ability to correct these errors. Classical information processing often uses error-correction algorithms that identify errors by checking whether information is consistent across multiple copies. This strategy is unfortunately not applicable to QIP, as quantum states cannot be copied. Moreover, direct measurements on qubits collapse their quantum states, reducing them to classical information. Fortunately, the theory of quantum error correction (QEC) overcomes these complications by encoding quantum information in entangled states of many qubits and performing parity measurements that identify errors in the system without destroying the encoded information. Implementing these codes is challenging, as it requires many qubits and rapid interleaving of operations and measurements. Moreover, so as not to introduce more errors into the system than QEC can correct, these operations and measurements need to be of sufficient fidelity and speed...
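The parity-measurement idea described above can be illustrated with the simplest QEC code. The following is a minimal classical simulation (a sketch, not part of the original work) of the three-qubit bit-flip repetition code: a logical qubit is encoded as a|000> + b|111>, and the two parities Z0Z1 and Z1Z2 are deterministic for any single bit-flip error, so reading them out locates the error without collapsing the encoded superposition.

```python
import numpy as np

def encode(alpha, beta):
    """Encode one logical qubit a|0> + b|1> into the
    three-qubit repetition code a|000> + b|111>."""
    state = np.zeros(8, dtype=complex)
    state[0b000] = alpha
    state[0b111] = beta
    return state

def apply_x(state, qubit):
    """Bit-flip (Pauli X) error on one qubit (0 = leftmost)."""
    out = np.zeros_like(state)
    for i, amp in enumerate(state):
        out[i ^ (1 << (2 - qubit))] = amp
    return out

def syndrome(state):
    """Parities Z0Z1 and Z1Z2. After a single bit-flip error every
    basis state with nonzero amplitude shares the same parities, so
    the measurement is deterministic and reveals only the error
    location, not the encoded amplitudes."""
    def parity(i, q1, q2):
        bit = lambda q: (i >> (2 - q)) & 1
        return bit(q1) ^ bit(q2)
    i = next(j for j, a in enumerate(state) if abs(a) > 0)
    return parity(i, 0, 1), parity(i, 1, 2)

def correct(state):
    """Look up the error location from the syndrome and undo it."""
    lookup = {(0, 0): None, (1, 0): 0, (1, 1): 1, (0, 1): 2}
    q = lookup[syndrome(state)]
    return state if q is None else apply_x(state, q)

# Encode an arbitrary superposition, corrupt qubit 1, then recover it.
alpha, beta = 0.6, 0.8
corrupted = apply_x(encode(alpha, beta), 1)
recovered = correct(corrupted)
assert np.allclose(recovered, encode(alpha, beta))
```

This toy model handles only bit-flip errors on a known error model; real QEC codes such as the surface code must also detect phase flips and perform the parity measurements with imperfect, noisy hardware, which is what makes the fidelity and speed requirements mentioned above so demanding.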
