Abstract

The sensitivity afforded by quantum sensors is limited by decoherence. Quantum error correction (QEC) can enhance sensitivity by suppressing decoherence, but it has a side effect: in realistic settings, it biases a sensor's output. If unaccounted for, this bias can systematically degrade a sensor's performance in experiment and yield misleading theoretical values for the minimum detectable signal. We analyze this effect in the experimentally motivated setting of continuous-time QEC, showing both how the bias can be remedied and how incorrect results can arise when it is not.