Abstract

The main promise of quantum computing is to efficiently solve certain problems that are prohibitively expensive for a classical computer. Most problems with a proven quantum advantage involve the repeated use of a black box, or oracle, whose structure encodes the solution. One measure of the algorithmic performance is the query complexity, i.e., the scaling of the number of oracle calls needed to find the solution with a given probability. Few-qubit demonstrations of quantum algorithms, such as Deutsch–Jozsa and Grover, have been implemented across diverse physical systems such as nuclear magnetic resonance, trapped ions, optical systems, and superconducting circuits. However, at the small scale, these problems can already be solved classically with a few oracle queries, limiting the obtained advantage. Here we solve an oracle-based problem, known as learning parity with noise, on a five-qubit superconducting processor. Executing classical and quantum algorithms using the same oracle, we observe a large gap in query count in favor of quantum processing. We find that this gap grows by orders of magnitude as a function of the error rates and the problem size. This result demonstrates that, while complex fault-tolerant architectures will be required for universal quantum computing, a significant quantum advantage already emerges in existing noisy systems.

Introduction

The limited size of engineered quantum systems and their extreme susceptibility to noise sources have made it hard so far to establish a clear advantage of quantum over classical computing. While the classical success probability has been exceeded in two-qubit demonstrations of the Deutsch–Jozsa[1] and Grover[2] algorithms, the required number of oracle queries has so far remained comparable. A particular learning task, known as binary classification, is to identify an unknown mapping from a set of bits onto 0 or 1. An example of binary classification is identifying a hidden parity function,[7, 8] defined by the unknown bit-string k, which computes f(D, k) = D · k mod 2 on a register of n data bits D = {D1, D2, ..., Dn} (Fig. 1a). Without noise, k is easy to learn: n linearly independent queries determine it by Gaussian elimination. When the oracle output is corrupted by noise, however, the problem becomes classically hard. Assuming that every bit introduces an equal error probability, the best known algorithms have a number of queries growing as O(n) and a runtime growing almost exponentially with n.[7–9] In view of the classical hardness of learning parity with noise (LPN), parity functions have been suggested as keys for secure and computationally easy authentication.[10, 11]
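To make the LPN setting above concrete, the sketch below simulates the noisy parity oracle classically and recovers k with a simple repeat-and-vote strategy: draw n random queries, solve the resulting linear system over GF(2), and tally the candidate keys over many batches. This is only an illustrative assumption of one naive classical approach, not the procedure used in the experiment, and all names and parameter values here (lpn_sample, noise_rate = 0.1, trials = 200) are hypothetical; the best known classical algorithms[7–9] are considerably more sophisticated.

```python
import numpy as np
from collections import Counter

def lpn_sample(k, noise_rate, rng):
    """One LPN query: a uniformly random register D and the noisy parity D . k mod 2."""
    n = len(k)
    D = rng.integers(0, 2, size=n, dtype=np.uint8)
    clean = int(np.sum(D & k)) & 1                      # f(D, k) = D . k mod 2
    label = clean ^ int(rng.random() < noise_rate)       # flip the answer with probability noise_rate
    return D, label

def solve_gf2(A, b):
    """Solve A x = b over GF(2) by Gauss-Jordan elimination; return None if A is singular."""
    A, b = A.copy(), b.copy()
    n = A.shape[0]
    for col in range(n):
        pivot = next((r for r in range(col, n) if A[r, col]), None)
        if pivot is None:
            return None                                  # singular batch, discard it
        A[[col, pivot]] = A[[pivot, col]]                # move the pivot row into place
        b[[col, pivot]] = b[[pivot, col]]
        for r in range(n):
            if r != col and A[r, col]:
                A[r] ^= A[col]                           # eliminate this column in every other row
                b[r] ^= b[col]
    return b                                             # A is now the identity, so b is the solution

def learn_parity_classical(k, noise_rate, trials, rng):
    """Draw batches of n random queries, solve each (possibly corrupted) linear system,
    and keep the candidate key that wins the majority vote across batches."""
    n = len(k)
    votes = Counter()
    for _ in range(trials):
        samples = [lpn_sample(k, noise_rate, rng) for _ in range(n)]
        A = np.array([D for D, _ in samples], dtype=np.uint8)
        b = np.array([lab for _, lab in samples], dtype=np.uint8)
        candidate = solve_gf2(A, b)
        if candidate is not None:
            votes[tuple(int(x) for x in candidate)] += 1
    return np.array(votes.most_common(1)[0][0], dtype=np.uint8)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    n, noise_rate = 5, 0.1                               # illustrative problem size and flip probability
    k_true = rng.integers(0, 2, size=n, dtype=np.uint8)
    k_est = learn_parity_classical(k_true, noise_rate, trials=200, rng=rng)
    print("hidden k:   ", k_true)
    print("estimated k:", k_est)
```

In this naive strategy a batch yields the correct key only when all n noisy labels happen to be clean, which occurs with probability (1 − η)^n for flip probability η, so the number of repetitions grows rapidly with both the noise level and the problem size, consistent with the classical scaling quoted above.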
