Reducing errors in quantum gates is critical to the development of quantum computers. To do so, any distortions in the control signals must be identified; however, conventional diagnostic tools are not always applicable when part of the system is under high vacuum, at cryogenic temperatures, or microscopic in scale. Here, we demonstrate a method to detect and compensate for amplitude-dependent phase changes, using the qubit itself as a probe. The technique is implemented with a microwave-driven trapped-ion qubit, where correcting the phase distortions yields a threefold reduction in the error of single-qubit gates composed of pulses of different amplitudes, attaining state-of-the-art performance benchmarked at 1.6(4)×10⁻⁶ error per Clifford gate.
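The abstract does not specify the compensation scheme in detail, but one plausible reading is that a phase offset is calibrated as a function of pulse amplitude (using qubit-based probe sequences) and then subtracted from the programmed phase of each pulse. The sketch below illustrates that idea under those assumptions; the calibration data, the polynomial model, and the helper `compensated_pulse_phase` are all hypothetical stand-ins, not the authors' actual implementation.

```python
import numpy as np

# Hypothetical calibration data: measured phase offset (rad) of the drive
# as a function of pulse amplitude (arbitrary units). In an experiment these
# points would come from qubit-based probe sequences (e.g. Ramsey-type
# measurements); here they are synthetic placeholders.
amplitudes = np.linspace(0.1, 1.0, 10)
measured_phase = 0.03 * amplitudes**2 + 0.01 * amplitudes  # stand-in data

# Fit a low-order polynomial model phi(A) to the calibration points.
phase_model = np.polynomial.Polynomial.fit(amplitudes, measured_phase, deg=2)

def compensated_pulse_phase(amplitude: float, nominal_phase: float) -> float:
    """Return the phase to program so that, after the amplitude-dependent
    distortion phi(A), the qubit experiences `nominal_phase`."""
    return nominal_phase - phase_model(amplitude)

# Example: a pulse with nominal phase 0 driven at amplitude 0.8 is
# programmed with a small negative phase offset to cancel the distortion.
print(compensated_pulse_phase(0.8, 0.0))
```

Pre-distorting the programmed phase in this way costs nothing at run time once the model is calibrated, which is why amplitude-dependent phase errors are a natural target for feed-forward correction.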