Abstract

We consider matter density effects in theories with a false ground state. Large and dense systems, such as stars, can destabilize a metastable minimum and allow for the formation of bubbles of the true minimum. We derive the conditions under which these bubbles form, as well as the conditions under which they either remain confined to the dense region or escape to infinity. The latter case leads to a phase transition in the universe at star formation. We explore the phenomenological consequences of such seeded phase transitions.
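The density effect described above can be sketched with a standard in-medium effective potential. Assuming a scalar field $\phi$ with vacuum potential $V(\phi)$ coupled linearly to matter of number density $n$ (the coupling $g$ and the linear form are illustrative, not taken from the source):

\[
V_{\rm eff}(\phi) \;=\; V(\phi) \;+\; g\,n\,\phi \,,
\]

so that inside a sufficiently dense body the term $g\,n\,\phi$ tilts the potential, lowering or removing the barrier that protects the metastable minimum and allowing a bubble of the true vacuum to nucleate where $n$ is large.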
