ABSTRACT
A key feature of active galactic nuclei (AGN) is their variability across all wavelengths. Typically, AGN vary by a few tenths of a magnitude or more over periods lasting from hours to years. By contrast, episodes of extreme variability, in which the luminosity changes by an amount that is a significant departure from the baseline variability, are known as AGN flares. These events are rare and their time-scales are poorly constrained, with most of the literature focusing on individual events. It has been suggested that extreme AGN variability, including flares, can provide insights into the accretion processes in the disc. With surveys such as the Legacy Survey of Space and Time promising millions of transient detections per night in the coming decade, there is a need for fast and efficient classification of AGN flares. The difficulty with systematic detection of AGN flares is that they must be identified against a stochastically variable baseline; defining a signal as a significant departure from this ever-present variability is a statistical challenge. Recently, Gaussian Processes have revolutionized the analysis of time-series data in many areas of astronomical research. They have, however, seen limited uptake within the field of transient detection and classification. Here, we investigate the efficacy of Gaussian Processes in detecting AGN flares in both simulated and real optical light curves. We show that GP analysis can successfully detect AGN flares with a false-positive rate of less than seven per cent, and we present examples of AGN light curves that show extreme variability.
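To make the detection idea concrete, the sketch below illustrates one way a Gaussian Process could be used to flag candidate flares against a stochastically variable baseline: fit a GP to the historical light curve, then flag new epochs that fall far outside the GP predictive distribution. The simulated light curve, the Matern kernel, the 3-sigma threshold, and the use of scikit-learn are all illustrative assumptions for this sketch, not the pipeline used in the paper.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern, WhiteKernel

rng = np.random.default_rng(42)

# Illustrative (synthetic) data: a smooth stochastic baseline plus a later flare.
t_base = np.sort(rng.uniform(0, 300, 120))            # baseline epochs (days)
mag_base = 19.0 + 0.15 * np.sin(t_base / 40.0) + rng.normal(0, 0.03, t_base.size)

t_new = np.sort(rng.uniform(300, 400, 40))            # newly observed epochs
mag_new = 19.0 + 0.15 * np.sin(t_new / 40.0) + rng.normal(0, 0.03, t_new.size)
mag_new[t_new > 370] -= 0.8                           # injected brightening ("flare")

# Fit a GP to the baseline light curve; the kernel choice is an assumption here.
kernel = Matern(length_scale=30.0, nu=1.5) + WhiteKernel(noise_level=0.03**2)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(t_base[:, None], mag_base)

# Compare new epochs with the GP predictive distribution and flag large departures.
mu, sigma = gp.predict(t_new[:, None], return_std=True)
z = (mag_new - mu) / sigma                            # standardized residuals
flare_candidates = t_new[np.abs(z) > 3.0]             # > 3 sigma departures

print(f"Flagged {flare_candidates.size} epochs as flare candidates")
```

In practice, the kernel and the departure threshold would be tuned against simulated flares to control the false-positive rate quoted above; the 3-sigma cut here is only a placeholder for that calibration step.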