ABSTRACT
Machine learning is a promising tool to reconstruct time-series phenomena, such as variability of active galactic nuclei (AGNs), from sparsely sampled data. Here, we use three Continuous Autoregressive Moving Average (CARMA) representations of AGN variability – the Damped Random Walk (DRW) and the overdamped and underdamped Harmonic Oscillator – to simulate 10-yr AGN light curves as they would appear in the upcoming Vera C. Rubin Observatory Legacy Survey of Space and Time (LSST), and provide a public tool to generate these for any survey cadence. We investigate the impact on AGN science of five proposed cadence strategies for LSST's primary Wide-Fast-Deep (WFD) survey. We apply, for the first time in astronomy, a novel Stochastic Recurrent Neural Network (SRNN) algorithm to reconstruct input light curves from the simulated LSST data, and provide a metric to evaluate how well SRNN can help recover the underlying CARMA parameters. We find that the light-curve reconstruction is most sensitive to the duration of gaps between observing seasons, and that of the proposed cadences, those that either change the balance between filters or avoid long gaps in the g band perform better. Overall, SRNN is a promising means to reconstruct densely sampled AGN light curves and recover the long-term structure function of the DRW process (SF∞) reasonably well. However, we find that for all cadences, CARMA/SRNN models struggle to recover the decorrelation time-scale (τ) due to the long gaps in survey observations. This may indicate a major limitation in using LSST WFD data for AGN variability science.
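The DRW case admits a compact illustration. The sketch below is a minimal example, not the public tool released with the paper: it assumes the standard Ornstein-Uhlenbeck parameterization of the DRW in terms of SF∞ and τ, uses illustrative values (sf_inf = 0.2 mag, tau = 200 d), and replaces a real LSST cadence with a toy seasonal window. The exact OU update is valid for arbitrary gaps between epochs, which is what makes it convenient for irregular survey sampling.

```python
import numpy as np

def simulate_drw(times, sf_inf=0.2, tau=200.0, mean_mag=20.0, seed=None):
    """Simulate a Damped Random Walk (Ornstein-Uhlenbeck) light curve.

    times    : observation epochs in days (need not be evenly spaced)
    sf_inf   : asymptotic structure function SF_inf in magnitudes
    tau      : decorrelation time-scale in days
    mean_mag : long-term mean magnitude
    """
    rng = np.random.default_rng(seed)
    times = np.asarray(times, dtype=float)
    sigma = sf_inf / np.sqrt(2.0)          # stationary std of the process
    mag = np.empty_like(times)
    mag[0] = mean_mag + sigma * rng.standard_normal()
    for i in range(1, len(times)):
        dt = times[i] - times[i - 1]
        decay = np.exp(-dt / tau)          # exact OU update across the gap dt
        mag[i] = (mean_mag
                  + decay * (mag[i - 1] - mean_mag)
                  + sigma * np.sqrt(1.0 - decay**2) * rng.standard_normal())
    return mag

# Dense 10-yr light curve (daily sampling), then down-sample with a toy
# LSST-like cadence: ~6-month observing seasons and sparse revisits.
dense_t = np.arange(0.0, 3650.0, 1.0)
dense_mag = simulate_drw(dense_t, sf_inf=0.2, tau=200.0, seed=42)

in_season = (dense_t % 365.0) < 180.0          # toy seasonal visibility window
visit = np.zeros_like(dense_t, dtype=bool)
visit[::7] = True                              # toy ~weekly revisit cadence
obs_t, obs_mag = dense_t[in_season & visit], dense_mag[in_season & visit]
print(f"{len(dense_t)} dense epochs -> {len(obs_t)} observed epochs")
```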