What is the minimum time required to measure a temperature? In this paper, we answer this question for a large class of processes in which temperature is inferred by measuring a probe (the thermometer) weakly coupled to the sample of interest, so that the probe's evolution is well described by a quantum Markovian master equation. Considering the most general control strategies on the probe (adaptive measurements, arbitrary control of the probe's state and Hamiltonian), we derive bounds on the measurement precision achievable in a finite time, and show that in many scenarios these fundamental limits can be saturated with a relatively simple experiment. We find that, for a general class of sample-probe interactions, the measurement uncertainty scales inversely with the duration of the process, a shot-noise-like behaviour that arises from the dissipative nature of thermometry. As a side result, we show that the Lamb shift induced by the probe-sample interaction can play a relevant role in thermometry, allowing for finite measurement resolution in the low-temperature regime. More precisely, the measurement uncertainty decays only polynomially with the temperature as T → 0, in contrast to the usual exponential decay in 1/T. We illustrate these general results for (i) a qubit probe interacting with a bosonic sample, where the role of the Lamb shift is highlighted, and (ii) a collective superradiant coupling between an N-qubit probe and the sample, which enables a quadratic decay of the measurement uncertainty with N.