Abstract

There is growing evidence that M-dwarf stars suffer radius inflation when compared to theoretical models, suggesting that the models are missing some key physics required to completely describe stars at effective temperatures $(T_{\rm SED})$ below about 4000\,K. The advent of Gaia DR2 distances finally makes available large datasets with which to determine the nature and extent of this effect. We employ an all-sky sample, comprising $>$15\,000 stars, to determine empirical relationships between luminosity, temperature and radius. This is accomplished using only geometric distances and multi-waveband photometry, by utilising a modified spectral energy distribution fitting method. The radii we measure show an inflation of $3 - 7\%$ compared to models, but no more than a $1 - 2\%$ intrinsic spread in the inflated sequence. We show that we are currently able to determine M-dwarf radii to an accuracy of $2.4\%$ using our method. However, this accuracy is limited by the precision of metallicity measurements, which contribute $1.7\%$ to the measured radius scatter. We also present evidence that stellar magnetism is currently unable to explain radius inflation in M dwarfs.
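
The route from observables to radius in such an SED-based approach can be sketched as follows (a schematic outline only, not the paper's exact pipeline, with $F_{\rm bol}$ denoting the bolometric flux implied by the fitted SED): the inverse-square law converts $F_{\rm bol}$ and the geometric distance $d$ into a luminosity, and the Stefan--Boltzmann law then yields the radius,
$$L = 4\pi d^{2} F_{\rm bol}, \qquad R = \sqrt{\frac{L}{4\pi \sigma_{\rm SB} T_{\rm SED}^{4}}},$$
where $\sigma_{\rm SB}$ is the Stefan--Boltzmann constant.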
