Abstract

Context. When stripped of their hydrogen-rich envelopes, stars with initial masses between ∼7 and 11 M⊙ may develop massive degenerate cores and collapse. Depending on the final structure and composition, the outcome can range from a thermonuclear explosion to the formation of a neutron star in an electron-capture supernova (ECSN). It has recently been demonstrated that stars in this mass range may be more prone to disruption than previously thought: they may initiate explosive oxygen burning while their central densities are still ρc ≲ 10^9.6 g cm^−3. At the same time, their envelopes expand significantly, leading to the complete depletion of helium. This combination makes them interesting candidates for type Ia supernovae, which we call (C)ONe SNe Ia, and it might have broader implications for the formation of neutron stars via ECSNe.

Aims. To constrain the observational counterparts of (C)ONe SNe Ia and the key properties that enable them, it is crucial to constrain the evolution, composition, and precollapse structure of their progenitors, as well as how these quantities evolve with cosmic time. In turn, this requires a detailed investigation of the final evolutionary stages preceding the collapse and their sensitivity to input physics.

Methods. We modeled the evolution of 252 single, nonrotating helium stars covering the initial mass range 0.8−3.5 M⊙, with metallicities between Z = 10^−4 and 0.02, and overshoot efficiency factors from f_OV = 0.0 to 0.016 across all convective boundaries. We used these models to constrain several properties of these stars, including their central densities, compositions, envelope masses, and radii at the onset of explosive oxygen ignition, as well as the final outcome as a function of initial helium star mass. We further investigated the sensitivity of these properties to the assumed mass-loss rates using an additional grid of 110 models with varying wind efficiencies.

Results. We find that helium star models with masses between ∼1.8 and 2.7 M⊙ evolve into 1.35−1.37 M⊙ (C)ONe cores that initiate explosive burning at central densities between log10(ρc / g cm^−3) ∼ 9.3 and 9.6. We constrained the amount of residual carbon retained after core carbon burning as a function of initial conditions and conclude that it plays a critical role in determining the final outcome: Chandrasekhar-mass degenerate cores that retain more than approximately 0.005 M⊙ of carbon result in (C)ONe SNe Ia, while those with lower carbon masses become ECSNe. We find that (C)ONe SNe Ia are more likely to occur at high metallicities, whereas ECSNe dominate at low metallicities. However, both SN Ia and ECSN progenitors expand significantly during the final evolutionary stages, so that for the most extended models a further binary interaction may occur. We constrain the ratio of (C)ONe SNe Ia to SNe Ib/c to be 0.17−0.30 at Z = 0.02 and 0.03−0.13 at Z ≤ 10^−3.

Conclusions. We conclude with a discussion of the potential observational properties of (C)ONe SNe Ia and their progenitors. In the few thousand years leading up to the explosion, at least some progenitors should be identifiable as luminous, metal-rich supergiants embedded in hydrogen-free circumstellar nebulae.
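
As a purely illustrative aid, the outcome rule summarized in the Results can be sketched as a simple classifier over the quantities quoted in the abstract (initial helium-star mass, final core mass, residual carbon mass, and central density at oxygen ignition). The function name, the treatment of the quoted ranges as sharp cut-offs, and the handling of masses outside the ∼1.8−2.7 M⊙ window are assumptions introduced here for illustration and are not the paper's method.

```python
# Illustrative sketch (not the paper's code): classify the fate of a stripped
# helium star from the numbers quoted in the abstract. The thresholds are the
# abstract's summary values treated as hard cut-offs, which is a simplification.

def classify_outcome(m_he_init, m_core, residual_carbon, log_rho_c):
    """
    m_he_init       : initial helium-star mass [Msun] (~1.8-2.7 for (C)ONe cores)
    m_core          : final degenerate (C)ONe core mass [Msun] (~1.35-1.37)
    residual_carbon : carbon mass retained after core carbon burning [Msun]
    log_rho_c       : log10(central density / g cm^-3) at oxygen ignition (~9.3-9.6)
    """
    # Outside the quoted mass window (assumed boundaries): lighter helium stars
    # are expected to leave white dwarfs, heavier ones to undergo Fe core collapse.
    if m_he_init < 1.8:
        return "white dwarf (no explosion)"
    if m_he_init > 2.7:
        return "Fe core-collapse SN Ib/c"

    # Chandrasekhar-mass (C)ONe core igniting oxygen at moderate density:
    # the residual carbon mass decides the outcome (0.005 Msun threshold).
    if m_core >= 1.35 and log_rho_c < 9.6:
        if residual_carbon > 0.005:
            return "(C)ONe SN Ia (thermonuclear disruption)"
        return "ECSN (neutron star formation)"
    return "undetermined in this simple sketch"


if __name__ == "__main__":
    # Example: a 2.2 Msun helium star retaining 0.01 Msun of carbon.
    print(classify_outcome(2.2, 1.36, 0.01, 9.4))
```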
