The scale of α-element yields is difficult to predict from theory because of uncertainties in massive star evolution, supernova physics, and black hole formation, and it is difficult to constrain empirically because the impact of higher yields can be compensated by greater metal loss in galactic winds. We use a recent measurement of the mean iron yield of core-collapse supernovae (CCSN) by Rodriguez et al., $\bar{y}^{\rm cc}_{\rm Fe} = 0.058 \pm 0.007\,M_\odot$, to infer the scale of α-element yields by assuming that the plateau of $[\alpha/{\rm Fe}]$ abundance ratios observed in low-metallicity stars represents the yield ratio of CCSN. For a plateau at $[\alpha/{\rm Fe}]_{\rm cc} = 0.45$, we find that the population-averaged yields of O and Mg are about equal to the solar abundances of these elements, $\log(y^{\rm cc}_{\rm O}/Z_{{\rm O},\odot}) = \log(y^{\rm cc}_{\rm Mg}/Z_{{\rm Mg},\odot}) = -0.01 \pm 0.1$, where $y^{\rm cc}_{\rm X}$ is the mass of element X produced by massive stars per unit mass of star formation. The inferred O and Fe yields agree with predictions of the Sukhbold et al. CCSN models assuming their Z9.6+N20 neutrino-driven engine, a scenario in which many progenitors with $M < 40\,M_\odot$ implode to black holes rather than exploding. The yields are lower than assumed in many models of the galaxy mass–metallicity relation, reducing the level of outflows needed to match observed abundances. Our one-zone chemical evolution models with $\eta = \dot{M}_{\rm out}/\dot{M}_* \approx 0.6$ evolve to solar metallicity at late times. By further requiring that models reach $[\alpha/{\rm Fe}] \approx 0$ at late times, we infer a Hubble-time integrated Type Ia supernova rate of $1.1 \times 10^{-3}\,M_\odot^{-1}$, compatible with estimates from supernova surveys.
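
The inference chain summarized above can be sketched numerically. The snippet below is a minimal illustration, not the paper's actual calculation: the CCSN rate per unit star formation (`R_cc`), the solar O and Fe mass fractions, and the mean SN Ia iron yield are assumed values chosen for illustration, and the final step is a crude time-integrated ratio argument that ignores the Ia delay-time distribution and outflow bookkeeping handled by the one-zone models, so its outputs only roughly bracket the quoted $-0.01 \pm 0.1$ dex and $1.1 \times 10^{-3}\,M_\odot^{-1}$.

```python
# Illustrative sketch of the yield-scale argument; all inputs not quoted in the
# abstract (R_cc, solar abundances, SN Ia Fe yield) are assumptions and shift
# the results at the ~0.1 dex level.
import numpy as np

# Quantities quoted in the abstract
ybar_fe_cc = 0.058        # mean Fe yield per CCSN [Msun] (Rodriguez et al.)
alpha_fe_plateau = 0.45   # low-metallicity [alpha/Fe] plateau [dex]

# Assumed auxiliary inputs (not from the abstract)
R_cc = 0.010              # CCSN per Msun of star formation, Kroupa-like IMF (assumption)
Z_o_sun, Z_fe_sun = 5.7e-3, 1.37e-3   # solar O and Fe mass fractions (assumption)
m_fe_ia = 0.7             # mean Fe yield per SN Ia [Msun] (assumption)

# Step 1: population-averaged CCSN Fe yield per unit mass of star formation
y_fe_cc = ybar_fe_cc * R_cc

# Step 2: the plateau fixes the CCSN O/Fe yield ratio relative to solar,
# [alpha/Fe]_cc = log10(y_O^cc / y_Fe^cc) - log10(Z_O,sun / Z_Fe,sun)
y_o_cc = y_fe_cc * 10**alpha_fe_plateau * (Z_o_sun / Z_fe_sun)
print(f"log10(y_O^cc / Z_O,sun) ~ {np.log10(y_o_cc / Z_o_sun):+.2f}")

# Step 3 (crude): reaching [alpha/Fe] ~ 0 requires SNe Ia to contribute
# (10**0.45 - 1) times the CCSN iron, giving a rough time-integrated Ia rate
fe_from_ia = (10**alpha_fe_plateau - 1.0) * y_fe_cc
print(f"rough Hubble-time integrated SN Ia rate ~ {fe_from_ia / m_fe_ia:.1e} per Msun")
```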