Abstract

Measuring the stellar parameters of A-type stars is more difficult than for FGK stars because of the sparse features in their spectra and the degeneracy between effective temperature (Teff) and surface gravity (log g). Machine learning can model the relationship between fundamental stellar parameters and spectral features by exploiting the full information content of large data sets rather than a small number of known features. Once trained, such a model is an efficient approach for predicting Teff and log g of A-type stars, especially when the continuum carries large uncertainties from flux calibration or extinction. In this paper, A-type stars are selected from LAMOST DR7 with a signal-to-noise ratio greater than 50 and Teff ranging from 7000 to 10,000 K. We apply the Random Forest (RF) algorithm, one of the most widely used machine learning algorithms, to establish regression relationships between the flux at all wavelengths and the corresponding stellar parameters, Teff and log g, respectively. The trained RF model not only regresses the stellar parameters but also ranks the wavelengths by their sensitivity to those parameters. Based on the rankings, we define line indices by merging adjacent sensitive wavelengths. These objectively defined line indices supplement the Lick indices and include some weak lines. We then apply the Support Vector Regression (SVR) algorithm to the newly defined line indices to measure temperature and gravity, and evaluate the results using stars in common with Simbad. In addition, the Gaia Hertzsprung-Russell diagram is used to check the accuracy of Teff and log g.
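As a minimal sketch of the pipeline summarized above, the Python code below illustrates the three stages for Teff: RF regression on the full flux vector, wavelength ranking via feature importances, and SVR on line indices built by merging adjacent top-ranked pixels. The paper does not specify its implementation; scikit-learn is assumed here, and the arrays `wave`, `flux`, and `teff` are synthetic placeholders standing in for the LAMOST spectra and labels.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

# Hypothetical stand-ins for the real data: `flux` would be an
# (n_stars, n_pixels) array of LAMOST spectra on a common wavelength
# grid `wave`, and `teff` the corresponding labels.
rng = np.random.default_rng(0)
n_stars, n_pixels = 500, 3000
wave = np.linspace(3900.0, 8800.0, n_pixels)     # wavelength grid (Angstroms)
flux = rng.random((n_stars, n_pixels))           # synthetic spectra
teff = rng.uniform(7000.0, 10000.0, n_stars)     # synthetic labels, 7000-10,000 K

X_tr, X_te, y_tr, y_te = train_test_split(flux, teff, test_size=0.2, random_state=0)

# Stage 1: random forest regression from flux to Teff; the trained
# forest also scores every wavelength pixel by its importance.
rf = RandomForestRegressor(n_estimators=200, random_state=0, n_jobs=-1)
rf.fit(X_tr, y_tr)

# Stage 2: rank pixels by importance, keep the most sensitive ones,
# and merge adjacent pixels into line indices (here simply the mean
# flux over each merged window).
top = np.sort(np.argsort(rf.feature_importances_)[::-1][:200])
windows = np.split(top, np.where(np.diff(top) > 1)[0] + 1)
for w in windows[:3]:                            # inspect a few windows
    print(f"line index window: {wave[w[0]]:.1f}-{wave[w[-1]]:.1f} A")
idx_tr = np.column_stack([X_tr[:, w].mean(axis=1) for w in windows])
idx_te = np.column_stack([X_te[:, w].mean(axis=1) for w in windows])

# Stage 3: support vector regression on the line indices.
svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=1000.0))
svr.fit(idx_tr, y_tr)
pred = svr.predict(idx_te)
print("Teff RMS residual (K):", np.sqrt(np.mean((pred - y_te) ** 2)))
```

In the paper the same procedure is carried out separately for Teff and log g, so a second RF/SVR pair trained with log g as the target would mirror the full method.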
