Abstract

When the shock wave generated in a supernova explosion breaks out of the stellar envelope, the first photons, typically in the X-ray-to-UV range, escape to the observer. Following this breakout emission, radiation from deeper shells diffuses out of the envelope as the supernova ejecta expands. Previous studies have shown that the radiation throughout the planar phase (i.e., before the expanding envelope has doubled its radius) originates in the same mass coordinate, called the “breakout shell.” We derive a self-similar solution for the radiation inside the envelope and show that this claim is incorrect: the diffusion wave propagates logarithmically into the envelope (in a Lagrangian sense), rather than remaining at a fixed mass coordinate. The logarithmic correction implies that the luminosity originates in regions where the density is ∼10 times higher than previously thought, where the photon production rate is higher, aiding thermalization. We show that this result has significant implications for the observed temperature. In our model, the radiation emitted from blue supergiant and Wolf–Rayet explosions is still expected to be out of thermal equilibrium during the entire planar phase, but the observed temperature will decrease by 2 orders of magnitude, contrary to previous estimates. Considering the conditions at the end of the planar phase, we also determine how the temperature and luminosity transition into the spherical phase.
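The quoted factor of ∼10 in density can be connected to a depth in mass coordinate with a toy calculation. For a radiative polytropic envelope of index n = 3 (a standard assumption for such progenitors, not a result of this paper), the density at a fractional depth x below the stellar edge scales as ρ ∝ x³ and the overlying mass as m ∝ x⁴, so ρ ∝ m^{3/4}. A minimal sketch under those assumed scalings:

```python
# Toy illustration (assumed polytropic scalings, not taken from the paper):
# near the edge of a radiative envelope with polytropic index n = 3,
# density scales with depth as rho ~ x**n and the overlying mass as
# m ~ x**(n+1), so rho ~ m**(n/(n+1)).

n = 3.0                        # polytropic index (assumed)
rho_exponent = n / (n + 1.0)   # rho ∝ m^(3/4) for n = 3

density_ratio = 10.0           # density enhancement quoted in the abstract
mass_ratio = density_ratio ** (1.0 / rho_exponent)

print(f"rho ∝ m^{rho_exponent:.2f}")
print(f"a {density_ratio:.0f}x density increase corresponds to a mass "
      f"coordinate ~{mass_ratio:.0f}x deeper than the breakout shell")
```

Under these assumptions, a tenfold rise in density corresponds to the diffusion wave sitting roughly 10^{4/3} ≈ 20 times deeper in mass than the breakout shell, which is consistent with a slow, logarithmic penetration over the planar phase rather than a fixed mass coordinate.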
