Abstract
Line intensity ratios (LIRs) of helium (He) atoms are known to depend on the electron density, ne, and temperature, Te, and are therefore widely used to evaluate these parameters; this is known as the He I LIR method. In this conventional method, measured LIRs are compared with theoretical values calculated with a collisional-radiative (CR) model to find the best-fit ne and Te. Basic CR models have been improved to take several effects into account. For instance, radiation trapping can occur to a significant degree in weakly ionized plasmas, leading to major alterations of the LIRs; this effect has been incorporated into CR models through optical escape factors. A new approach to evaluating ne and Te from He I LIRs has recently been explored using machine learning (ML). In the ML-aided LIR method, a predictive model is developed from training data consisting of inputs (measured LIRs) and desired/known outputs (ne or Te measured with other diagnostics). It has been demonstrated that this new method predicts ne and Te better than the conventional method coupled with a CR model, not only for He but also for other species. This review focuses mainly on low-temperature plasmas with Te ⩽ 10 eV in linear plasma devices.
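The two workflows contrasted in the abstract can be illustrated schematically. The sketch below (not drawn from the review) shows the conventional approach as a grid comparison of measured LIRs against a table of CR-model values, and the ML-aided approach as a regressor trained on LIR inputs with known ne and Te outputs. The function cr_model_lirs is a purely synthetic placeholder standing in for a real CR model, and the training data are simulated; in practice the inputs would be measured LIRs and the targets would be ne and Te from other diagnostics such as probes.

```python
# Illustrative sketch only: contrasts the conventional He I LIR method
# (grid comparison against a CR-model table) with an ML-aided regression.
# The "CR model" below is a synthetic placeholder, not a real CR model.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def cr_model_lirs(log_ne, te):
    """Placeholder mapping (log10 ne [m^-3], Te [eV]) -> two LIRs."""
    r1 = 0.3 + 0.05 * (log_ne - 18.0) + 0.02 * te
    r2 = 1.0 + 0.10 * np.tanh(te - 5.0) - 0.03 * (log_ne - 18.0)
    return np.array([r1, r2])

# Conventional method: precompute LIRs on a (ne, Te) grid, then pick the
# grid point whose theoretical LIRs best match the measured ones.
log_ne_grid = np.linspace(17.0, 20.0, 61)   # log10(ne / m^-3)
te_grid = np.linspace(1.0, 10.0, 46)        # Te in eV (low-temperature range)
table = np.array([[cr_model_lirs(n, t) for t in te_grid] for n in log_ne_grid])

def conventional_fit(measured_lirs):
    chi2 = ((table - measured_lirs) ** 2).sum(axis=-1)
    i, j = np.unravel_index(np.argmin(chi2), chi2.shape)
    return log_ne_grid[i], te_grid[j]

# ML-aided method: train a regressor on (LIRs -> ne, Te) pairs.
# Here the pairs are simulated with noise; in the ML-aided LIR method they
# would be measured LIRs paired with ne, Te from other diagnostics.
n_train = 2000
true_params = np.column_stack([rng.uniform(17.0, 20.0, n_train),
                               rng.uniform(1.0, 10.0, n_train)])
x_train = np.array([cr_model_lirs(*p) for p in true_params])
x_train += rng.normal(scale=0.01, size=x_train.shape)   # measurement noise
model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(x_train, true_params)

# Evaluate one "measured" LIR pair with both approaches.
measured = cr_model_lirs(18.5, 4.0) + rng.normal(scale=0.01, size=2)
print("conventional (log10 ne, Te):", conventional_fit(measured))
print("ML-aided     (log10 ne, Te):", model.predict(measured.reshape(1, -1))[0])
```

In this toy setting both approaches recover the parameters; the review's point is that with real measured training data the ML-aided method can outperform the CR-model comparison, since it is not limited by approximations in the model itself.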