Abstract

Noise mitigation and reduction will be crucial for obtaining useful answers from near-term quantum computers. In this work, we present a general framework based on machine learning for reducing the impact of quantum hardware noise on quantum circuits. Our method, called noise-aware circuit learning (NACL), applies to circuits designed to compute a unitary transformation, prepare a set of quantum states, or estimate an observable of a many-qubit state. Given a task and a device model that captures information about the noise and connectivity of qubits in a device, NACL outputs an optimized circuit to accomplish this task in the presence of noise. It does so by minimizing a task-specific cost function over circuit depths and circuit structures. To demonstrate NACL, we construct circuits resilient to a fine-grained noise model derived from gate set tomography on a superconducting-circuit quantum device, for applications including quantum state overlap, quantum Fourier transform, and W-state preparation.
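As a concrete illustration of the optimization the abstract describes, the following is a minimal Python sketch of a noise-aware circuit-learning loop: a parameterized circuit is simulated under noise, and both its gate angles and its depth are chosen to minimize a task-specific cost. The single-qubit compilation target (a Hadamard), the Rz/Rx ansatz, and the depolarizing channel standing in for the device model are all illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a NACL-style optimization loop (illustrative only).
# Assumptions not taken from the paper: a single qubit, a depolarizing
# channel standing in for the device noise model, and an Rz/Rx ansatz.
import numpy as np
from scipy.optimize import minimize

I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)  # target unitary

def rot(axis, theta):
    """Single-qubit rotation exp(-i * theta/2 * axis)."""
    return np.cos(theta / 2) * I2 - 1j * np.sin(theta / 2) * axis

def depolarize(rho, p=0.01):
    """Toy noise channel applied after every gate."""
    return (1 - p) * rho + p * I2 / 2

def noisy_circuit(rho, angles):
    """Alternate Rz/Rx rotations, each followed by the noise channel."""
    for k, theta in enumerate(angles):
        U = rot(Z if k % 2 == 0 else X, theta)
        rho = depolarize(U @ rho @ U.conj().T)
    return rho

# Probe states |0>, |+>, |+i>.
probes = [np.array([1, 0], dtype=complex),
          np.array([1, 1], dtype=complex) / np.sqrt(2),
          np.array([1, 1j], dtype=complex) / np.sqrt(2)]

def cost(angles):
    """Task-specific cost: mean infidelity to the ideal target outputs."""
    infid = 0.0
    for psi in probes:
        target = H @ psi
        rho = noisy_circuit(np.outer(psi, psi.conj()), angles)
        infid += 1 - np.real(target.conj() @ rho @ target)
    return infid / len(probes)

# Structure search: optimize the angles at several depths and keep the
# circuit with the lowest cost *under noise* (deeper is not always better).
rng = np.random.default_rng(0)
results = [minimize(cost, rng.uniform(0, 2 * np.pi, d), method="Nelder-Mead")
           for d in range(1, 6)]
best = min(results, key=lambda r: r.fun)
print("best depth:", len(best.x), "cost:", best.fun)
```

In NACL itself, the toy channel would be replaced by the noisy-gate descriptions in the device model, and the search ranges over circuit structures as well as depths.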

Highlights

  • Recent years have seen a surge in quantum computer hardware development, and several quantum computing platforms now offer tens of qubits that can be controlled and coupled with fidelities high enough to execute quantum circuits of limited depth

  • In this work we study how machine learning (ML) can be applied to formulate noise-aware quantum circuits that can be executed on near-term quantum hardware to produce reliable results

  • We demonstrate noise-aware circuit learning (NACL) using a fine-grained noise model derived from one- and two-qubit gate-set tomography (GST) [10,11,33] experiments run on the five-qubit IBM Q Ourense superconducting qubit device


Summary

INTRODUCTION

Recent years have seen a surge in quantum computer hardware development, and several quantum computing platforms now offer tens of qubits that can be controlled and coupled with fidelities high enough to execute quantum circuits of limited depth. The challenge is that naive compilations of most nontrivial quantum algorithms require circuit depths that are currently out of reach for near-term hardware. Motivated by this challenge, in this work we study how machine learning (ML) can be applied to formulate noise-aware quantum circuits that can be executed on near-term quantum hardware to produce reliable results. Incorporating detailed noise models into one's circuit optimization, as we do here, is compelling at present given the advent of advanced characterization techniques like gate-set tomography [10,11]. These techniques produce fine-grained details describing the actual evolution of qubits in near-term hardware, e.g., estimates of process matrices representing the action of imperfect quantum gates. In Sec. II E we summarize the optimization methods used by NACL.
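To make the role of such process matrices concrete, here is a minimal sketch of how an estimated channel becomes a superoperator that can be applied to any input state by matrix multiplication. The "noisy X gate" (ideal X followed by depolarizing noise with a hypothetical error rate) is illustrative; a real GST estimate would supply the matrix directly.

```python
# Minimal sketch: using a process matrix for an imperfect gate, of the kind
# GST estimates. The noise model here (ideal X, then depolarizing noise with
# an assumed rate p) is illustrative, not a GST result.
import numpy as np

X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
p = 0.02  # assumed depolarizing error rate

# Kraus operators of "ideal X, then depolarizing noise".
kraus = [np.sqrt(1 - 3 * p / 4) * X] + [np.sqrt(p / 4) * P @ X for P in (X, Y, Z)]

# Superoperator on row-major-vectorized density matrices:
# vec(E(rho)) = S @ vec(rho), with S = sum_k K_k (x) conj(K_k).
S = sum(np.kron(K, K.conj()) for K in kraus)

rho0 = np.array([[1, 0], [0, 0]], dtype=complex)  # |0><0|
rho1 = (S @ rho0.reshape(-1)).reshape(2, 2)       # state after the noisy gate
print("P(measure 1):", rho1[1, 1].real)           # 1 - p/2, below the ideal 1
```

Once every gate in a device model is represented this way, evaluating a candidate circuit under realistic noise reduces to chaining superoperators, which is exactly what a noise-aware cost function needs.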

The remainder of the paper is organized as follows:

  • Parameterized circuit
  • Device model
  • Parallelization
  • Cost functions
  • State preparation
  • Compilation
  • Machine learning algorithms
  • NOISE MODEL
  • IMPLEMENTATION FOR OBSERVABLE EXTRACTION
  • IMPLEMENTATION FOR STATE PREPARATION
    • Four-qubit W-state preparation
    • Five-qubit W-state preparation
  • IMPLEMENTATION FOR CIRCUIT COMPILATION
  • DISCUSSION AND CONCLUSIONS
