Abstract

When a star evolves into a red giant, the enhanced coupling between core-based gravity modes and envelope-based pressure modes forms mixed modes, allowing its deep interior to be probed by asteroseismology. The ability to obtain information about stellar interiors is important for constraining theories of stellar structure and evolution, for which the origin of various discrepancies between prediction and observation is still under debate. Ongoing speculation surrounds the possibility that some red giant stars may harbour strong (dynamically significant) magnetic fields in their cores, but interpretation of the observational data remains controversial. In part, this is tied to shortfalls in our understanding of the effects of strong fields on the seismic properties of gravity modes, a problem that lies beyond the regime of standard perturbative methods. Here, we investigate the effect of a strong magnetic field on the asymptotic period spacings of gravity modes. We use a Hamiltonian ray approach to measure the volume of phase space occupied by mode-forming rays, this being roughly proportional to the average density of modes (number of modes per unit frequency interval). A strong field appears to increase this density systematically by about 10 per cent, implying a ∼10 per cent smaller period spacing. Evidence of near-integrability in the ray dynamics hints that the gravity-mode spectrum may still exhibit pseudo-regularities under a strong field.
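For readers unfamiliar with the ray formalism, the Python sketch below illustrates the basic ingredients in a deliberately simplified, non-magnetic setting; it is not the paper's implementation. It assumes a two-dimensional Boussinesq dispersion relation for internal gravity waves, omega = N(z)|kx|/|k|, with a hypothetical buoyancy-frequency profile N(z) and arbitrary parameter values. Rays are advanced with Hamilton's equations (dx/dt = d(omega)/dk, dk/dt = -d(omega)/dx), and a crude Monte Carlo count of the phase-space volume below a chosen frequency stands in for the kind of mode-count estimate described in the abstract.

import numpy as np
from scipy.integrate import solve_ivp

# Illustrative setup (not the paper's): 2-D Boussinesq internal gravity waves in a
# plane-parallel layer, with local dispersion relation omega(z, k) = N(z)*|kx|/|k|.
# Rays obey Hamilton's equations with omega playing the role of the Hamiltonian.

def N_profile(z, N0=1e-3, H=1.0):
    """Hypothetical buoyancy-frequency profile (arbitrary units)."""
    return N0 * (1.0 + z / H)

def dN_dz(z, N0=1e-3, H=1.0):
    """Vertical gradient of the assumed linear N(z) profile."""
    return N0 / H

def ray_rhs(t, y):
    """Hamilton's equations for the ray state (x, z, kx, kz)."""
    x, z, kx, kz = y
    N = N_profile(z)
    kmag = np.hypot(kx, kz)
    domega_dkx = N * np.sign(kx) * kz**2 / kmag**3   # group velocity, x-component
    domega_dkz = -N * np.abs(kx) * kz / kmag**3      # group velocity, z-component
    domega_dz = dN_dz(z) * np.abs(kx) / kmag         # refraction by the N(z) gradient
    return [domega_dkx, domega_dkz, 0.0, -domega_dz]  # dkx/dt = 0 (no x-dependence)

# Trace one ray from an assumed initial condition (illustrative values).
y0 = [0.0, 0.5, 50.0, 200.0]   # x, z, kx, kz
sol = solve_ivp(ray_rhs, (0.0, 5.0e4), y0, rtol=1e-8, atol=1e-12, max_step=100.0)

def omega_of(y):
    """Frequency along the ray; conserved because the Hamiltonian is autonomous."""
    x, z, kx, kz = y
    return N_profile(z) * np.abs(kx) / np.hypot(kx, kz)

print("frequency drift along ray:", omega_of(sol.y[:, -1]) - omega_of(y0))

# Semiclassical (Weyl-type) idea: the number of modes with frequency below omega_c
# scales with the phase-space volume where omega(x, k) <= omega_c, so the mode
# density per unit frequency tracks the volume of the corresponding phase-space
# shell. A crude Monte Carlo estimate over a bounded (z, kx, kz) box:
rng = np.random.default_rng(0)
omega_c = 5.0e-4
zs = rng.uniform(0.0, 1.0, 200_000)
kxs = rng.uniform(-500.0, 500.0, 200_000)
kzs = rng.uniform(-500.0, 500.0, 200_000)
frac = np.mean(N_profile(zs) * np.abs(kxs) / np.hypot(kxs, kzs) <= omega_c)
print("fraction of sampled phase space below omega_c:", frac)

In the same spirit, because gravity modes are roughly uniformly spaced in period, a ∼10 per cent increase in the number of modes per unit frequency interval translates directly into a ∼10 per cent smaller asymptotic period spacing, which is the connection drawn in the abstract.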
