Abstract

Most active supermassive black holes in present-day galaxies are underfed, appearing as low-luminosity active galactic nuclei (LLAGN). LLAGNs display complex multiwavelength broadband spectral energy distributions (SEDs), dominated by non-thermal processes that are explained to first order by a radiatively inefficient accretion flow (RIAF) and a relativistic jet. Because of the computational cost of generating such SEDs, it has hitherto not been possible to perform statistical fits to observed broadband SEDs, since such procedures require generating many thousands of models on the fly. Here, we use a deep learning (DL) method to interpolate a large grid of tens of thousands of model SEDs for RIAFs and jets, covering the parameter space appropriate for LLAGNs. Not only does the DL method compute accurate models, it does so hundreds of thousands of times faster than solving the underlying dynamical and radiative transfer equations. This brings RIAF and jet models into the realm of Bayesian inference. We demonstrate that the combination of a DL interpolator and a Markov chain Monte Carlo (MCMC) ensemble sampler can recover the ground-truth parameters of mock LLAGN data. We apply our model to existing radio-to-X-ray observations of three LLAGNs: M87, NGC 315, and NGC 4261. We demonstrate that our model can estimate the relevant parameters of these accreting black holes, such as the mass accretion and outflow rates, at a small fraction of the computational cost of previous approaches.
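The workflow the abstract describes, a fast surrogate standing in for the expensive RIAF+jet SED solver inside an MCMC loop, can be sketched as follows. This is a minimal illustration, not the authors' code: the `surrogate_sed` toy function is a hypothetical stand-in for the trained deep-learning interpolator, and a plain Metropolis sampler replaces the ensemble sampler used in practice.

```python
import numpy as np

rng = np.random.default_rng(0)

def surrogate_sed(theta, log_nu):
    """Hypothetical stand-in for the DL interpolator: maps model
    parameters (here a toy accretion-rate amplitude and jet-peak
    frequency) to a log-luminosity spectrum."""
    amp, peak = theta
    return amp - 0.1 * (log_nu - peak) ** 2  # toy parabola in log-log space

# Mock "observed" SED from known ground-truth parameters, radio to X-ray
log_nu = np.linspace(9.0, 19.0, 20)
theta_true = np.array([1.0, 14.0])
sigma = 0.2
y_obs = surrogate_sed(theta_true, log_nu) + rng.normal(0.0, sigma, log_nu.size)

def log_prob(theta):
    # Gaussian log-likelihood; each call costs microseconds because the
    # surrogate replaces the radiative-transfer solver
    resid = y_obs - surrogate_sed(theta, log_nu)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Minimal Metropolis-Hastings chain (an ensemble sampler such as emcee
# would be used in a real analysis)
theta = np.array([0.5, 13.0])   # deliberately offset starting point
lp = log_prob(theta)
chain = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.05, 2)
    lp_prop = log_prob(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    chain.append(theta.copy())

posterior = np.array(chain[5000:])      # discard burn-in
print(posterior.mean(axis=0))           # should land near theta_true
```

Because every likelihood evaluation calls only the surrogate, the sampler can afford the many thousands of model evaluations that a full posterior exploration requires, which is exactly the speed-up the abstract claims for the DL interpolator.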
