ABSTRACT

Accompanying Part I, this sequel delineates a validation of the recently proposed AI for Regularization in radio-interferometric Imaging (AIRI) algorithm on observations from the Australian Square Kilometre Array Pathfinder (ASKAP). The monochromatic AIRI-ASKAP images showcased in this work are formed using the same parallelized and automated imaging framework described in Part I: ‘uSARA validated on ASKAP data’. Using a Plug-and-Play approach, AIRI differs from uSARA by substituting a trained denoising deep neural network (DNN) for the proximal operator in the regularization step of the forward–backward algorithm during deconvolution. We build a shelf of DNN denoisers trained to target the estimated image dynamic ranges of our selected data. Furthermore, we quantify the variations of AIRI reconstructions when selecting the nearest DNN on the shelf versus using a universal DNN with the highest dynamic range, opening the door to a more complete framework that not only delivers image estimation but also quantifies epistemic model uncertainty. We continue our comparative analysis of source structure, diffuse flux measurements, and spectral index maps of selected target sources as imaged by AIRI and the algorithms in Part I, uSARA and WSClean. Overall, AIRI images show improved reconstruction of diffuse components compared with uSARA and WSClean. The scientific potential delivered by AIRI is evident in enhanced imaging precision, more accurate spectral index maps, and a significant acceleration of the deconvolution, with AIRI being four times faster than its sub-iterative sparsity-based counterpart uSARA.
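For illustration, the Plug-and-Play forward–backward scheme underlying AIRI alternates a gradient (forward) step on the data-fidelity term with a regularization (backward) step in which a learned denoiser takes the place of the proximal operator. The sketch below is a minimal, generic rendering of that iteration, assuming user-supplied callables `forward_op`, `adjoint_op`, and `denoiser` and a suitable `step_size`; it is not the authors' parallelized AIRI implementation.

```python
import numpy as np

def pnp_forward_backward(y, forward_op, adjoint_op, denoiser, step_size,
                         x0, n_iter=500, tol=1e-5):
    """Generic Plug-and-Play forward-backward iteration (illustrative only).

    Targets a data-fidelity term 0.5 * ||forward_op(x) - y||^2, while the
    learned `denoiser` stands in for the proximal operator of an implicit
    regularizer, as in the AIRI approach described in the abstract.
    """
    x = x0.copy()
    for _ in range(n_iter):
        # Forward (gradient) step on the data-fidelity term.
        residual = forward_op(x) - y
        x_grad = x - step_size * adjoint_op(residual)
        # Backward (regularization) step: apply the trained DNN denoiser.
        x_next = denoiser(x_grad)
        # Stop when the relative change of the iterate is small.
        if np.linalg.norm(x_next - x) <= tol * max(np.linalg.norm(x), 1e-12):
            return x_next
        x = x_next
    return x
```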