Abstract

We investigate quantitative constraints on the triple-α reaction rate from stellar evolution theory, motivated by the significant recent revision of the rate proposed by nuclear physics calculations. Targeted stellar models were computed to assess the impact of the revised rate over the mass range 0.8 ≤ M/M⊙ ≤ 25 and the metallicity range Z = 0 to Z = 0.02. The revised rate has a significant impact on the evolution of low- and intermediate-mass stars, while its influence on the evolution of massive stars (M > 10 M⊙) is minimal. We find that adopting the revised rate suppresses helium shell flashes during the AGB phase for stars in the initial mass range 0.8 ≤ M/M⊙ ≤ 6, which contradicts observations. The absence of helium shell flashes is due to the weak temperature dependence of the revised triple-α reaction cross section at the temperatures involved. Our models suggest that the cross section, written as proportional to T^ν, should have at least ν > 10 at T = 1−1.2×10^8 K. We also derive the helium ignition curve to estimate the maximum cross section consistent with retaining low-mass first red giants. The semi-analytically derived ignition curves indicate that the reaction rate should be less than ∼10^-29 cm^6 s^-1 mol^-2 at ≈10^7.8 K, which is about three orders of magnitude larger than the value in the NACRE compilation.
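The temperature-sensitivity exponent ν quoted above is defined by rate ∝ T^ν, i.e. ν = d ln(rate)/d ln(T). As a minimal sketch of how ν can be estimated numerically from any rate function (the power-law toy rate below is a placeholder assumption, not a fit to any published triple-α rate):

```python
import math

def temperature_exponent(rate, T, eps=1e-4):
    """Estimate nu = d ln(rate) / d ln(T) at temperature T [K]
    with a centered finite difference in log space."""
    lnT = math.log(T)
    hi = math.log(rate(math.exp(lnT + eps)))
    lo = math.log(rate(math.exp(lnT - eps)))
    return (hi - lo) / (2.0 * eps)

# Toy rate with a known power-law form, rate ∝ T^20
# (hypothetical normalization; for illustration only):
def toy_rate(T):
    return 1e-30 * (T / 1e8) ** 20.0

# Evaluate nu near the temperature range cited in the abstract,
# T ≈ 1−1.2×10^8 K; for this toy rate, nu recovers the exponent 20.
nu = temperature_exponent(toy_rate, 1.1e8)
```

The same finite-difference estimator can be applied to a tabulated rate (e.g. via interpolation in log T) to check whether it exceeds the ν > 10 threshold discussed in the text.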
