Abstract

We have developed a new fully automated Artificial Intelligence (AI)-based method for deriving optimal models of complex absorption systems. The AI structure is built around VPFIT, a well-developed and extensively tested nonlinear least-squares code. The new method forms a sophisticated parallelized system, eliminating human decision-making and hence bias. Here, we describe the workings of such a system and apply it to synthetic spectra, in doing so establishing recommended methodologies for future analyses of Very Large Telescope (VLT) and Extremely Large Telescope (ELT) data. One important result is that modelling line broadening for high-redshift absorption components should include both thermal and turbulent components. Failing to do so makes it easy to derive the wrong model and hence incorrect parameter estimates. One topical application of our method concerns searches for spatial or temporal variations in fundamental constants. This subject is one of the key science drivers for the European Southern Observatory's ESPRESSO spectrograph on the VLT and for the HIRES spectrograph on the ELT. The quality of new data demands completely objective and reproducible methods. The Monte Carlo aspects of the new method described here reveal that model non-uniqueness can be significant, indicating that it is unrealistic to expect to derive an unambiguous estimate of the fine structure constant α from one or a very small number of measurements. No matter how optimal the modelling method, it is a fundamental requirement to use a large sample of measurements to meaningfully constrain temporal or spatial α variation.
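The point about line broadening can be made concrete. For an absorption component, the thermal and turbulent contributions to the Doppler b-parameter add in quadrature, b² = 2k_BT/m + b_turb², so ignoring one term biases the inferred temperature or turbulence. The following minimal Python sketch (function name and example values are illustrative, not from the paper) shows the combination:

```python
import math

K_B = 1.380649e-23        # Boltzmann constant, J/K
AMU = 1.66053906660e-27   # atomic mass unit, kg

def total_b_kms(temperature_k, b_turb_kms, atomic_mass_amu):
    """Total Doppler b-parameter in km/s, combining thermal and
    turbulent broadening in quadrature: b^2 = 2 k_B T / m + b_turb^2."""
    b_thermal_kms = math.sqrt(2.0 * K_B * temperature_k
                              / (atomic_mass_amu * AMU)) / 1e3
    return math.sqrt(b_thermal_kms**2 + b_turb_kms**2)

# Illustrative: at T = 10^4 K, hydrogen (mass ~1 amu) is dominated by
# thermal broadening, while a heavy ion like Mg (mass ~24.3 amu) with
# b_turb = 5 km/s is dominated by the turbulent term.
```

Because the thermal term scales as 1/√m, fitting the same component in species of different mass is what disentangles temperature from turbulence; a purely turbulent (or purely thermal) model collapses that distinction and can yield a statistically acceptable but physically wrong fit.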
