If there was a first-order phase transition in the early universe, there should be an associated stochastic background of gravitational waves. In this paper, we point out that the characteristic frequency of the spectrum due to phase transitions which took place in the temperature range $100~\mathrm{GeV}$--$10^{7}~\mathrm{GeV}$ is precisely in the window that will be probed by the second generation of space-based interferometers such as the Big Bang Observer (BBO). Taking into account the astrophysical foreground, we determine the type of phase transitions that could be detected at LISA, LIGO, or BBO, in terms of the amount of supercooling and the duration of the phase transition that are needed. These two quantities can be calculated for any given effective scalar potential describing the phase transition. In particular, the new models of electroweak symmetry breaking that have been proposed in the last few years typically have a Higgs potential different from that of the standard model. They could lead to a gravitational wave signature in the milli-Hertz frequency range, which is precisely where LISA has its peak sensitivity. We also show that the signal coming from phase transitions taking place at $T \sim 1$--$100~\mathrm{TeV}$ could entirely screen the relic gravitational wave signal expected from standard inflationary models.
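The supercooling and duration referred to above are conventionally encoded in two dimensionless parameters, usually denoted $\alpha$ and $\beta/H_*$. The expressions below are the standard definitions from the phase-transition literature rather than formulas quoted in this abstract, with $T_*$ the transition temperature, $\epsilon$ the latent heat released, $\rho_{\mathrm{rad}}$ the radiation energy density, and $S_3$ the three-dimensional bounce action computed from the effective scalar potential:

\begin{align}
\alpha &\equiv \frac{\epsilon(T_*)}{\rho_{\mathrm{rad}}(T_*)}, \qquad \rho_{\mathrm{rad}}(T_*) = \frac{\pi^2}{30}\, g_*\, T_*^{4},\\
\frac{\beta}{H_*} &\equiv T_* \left.\frac{d}{dT}\!\left(\frac{S_3(T)}{T}\right)\right|_{T=T_*},
\end{align}

where $g_*$ counts the relativistic degrees of freedom at $T_*$. Larger $\alpha$ (stronger supercooling) and smaller $\beta/H_*$ (a longer-lasting transition) both enhance the resulting gravitational wave signal.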