Ground states of interacting QFTs are non-Gaussian states, i.e. their connected n-point correlation functions do not vanish for n > 2, in contrast to the free QFT case. We show that, when the ground state of an interacting QFT evolves under a free massive QFT for a long time (a scenario that can be realised by a quantum quench), the connected correlation functions decay and all local physical observables equilibrate to values given by a Gaussian density matrix that retains memory only of the initial two-point correlation function. The argument hinges on the fundamental physical principle of cluster decomposition, which is valid for the ground state of a general QFT. An analogous result was already known in the case of d = 1 spatial dimensions, where it is a special case of the so-called generalised Gibbs ensemble (GGE) hypothesis; we now generalise it to higher dimensions. Moreover, in the case of massless free evolution, even though the evolution may lead not to equilibration but to an unbounded growth of correlations with time, the GGE correctly gives the leading-order asymptotic behaviour of correlation functions in the thermodynamic and large-time limits. The demonstration is carried out in the context of a bosonic relativistic QFT, but the arguments apply more generally.
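The Gaussian stationary ensemble referred to above can be sketched with the standard GGE construction for free theories; the mode labels and Lagrange multipliers below are illustrative notation, not taken from the paper itself:

```latex
% Gaussianity (Wick factorisation): all connected correlators beyond the
% second vanish, so the stationary state is fixed by the two-point function alone:
\langle \phi(x_{1}) \cdots \phi(x_{n}) \rangle_{c} = 0 \qquad (n > 2).
%
% Under free evolution the mode occupation numbers n_k are conserved, and the
% GGE is the maximum-entropy Gaussian density matrix matching their initial
% expectation values in the interacting ground state |\Psi_0>:
\rho_{\mathrm{GGE}}
  = Z^{-1} \exp\Big(-\textstyle\sum_{k} \lambda_{k}\, n_{k}\Big),
\qquad
\operatorname{Tr}\!\big(\rho_{\mathrm{GGE}}\, n_{k}\big)
  = \langle \Psi_{0} |\, n_{k}\, | \Psi_{0} \rangle .
```

Because the exponent is quadratic in the fields, this density matrix is Gaussian, which is how equilibration to it encodes loss of all memory beyond the two-point data.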