Abstract

Solution intervals are often used to improve the signal-to-noise ratio during radio interferometric gain calibration. This work investigates how factors such as the noise level, intrinsic gain variability, degree of model incompleteness, and the presence of radio frequency interference affect the selection of solution intervals for calibration. We perform a range of interferometric simulations to demonstrate how these factors, in combination with the choice of solution intervals, affect calibration and imaging outputs, and discuss practical guidelines for choosing optimal solution intervals. We also present an algorithm capable of automatically selecting suitable solution intervals during calibration. By applying the algorithm to both simulated and real data, we show that it can successfully choose solution intervals that strike a good balance between capturing intrinsic gain variability and not fitting noise, provided the data are not too inhomogeneously flagged. Finally, we discuss several practical considerations that emphasize the need to develop regularized calibration algorithms that do not require solution intervals.
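The trade-off the abstract describes can be illustrated with a toy sketch (this is not the paper's algorithm or data; the gain model, noise level, and interval sizes below are all invented for illustration): averaging noisy per-timestamp gain estimates over a solution interval suppresses noise, but an interval longer than the intrinsic variability timescale smears real gain structure, so an intermediate interval minimizes the error.

```python
import numpy as np

rng = np.random.default_rng(42)
n_times = 512
t = np.arange(n_times)

# Hypothetical "true" gain amplitude: a slow sinusoidal drift.
true_gain = 1.0 + 0.1 * np.sin(2 * np.pi * t / 128)

# Noisy per-timestamp gain estimates (noise level chosen arbitrarily).
noisy = true_gain + rng.normal(0.0, 0.2, n_times)

def solve_with_interval(estimates, interval):
    """Average estimates in blocks of `interval` timestamps and
    broadcast each block's solution back to its timestamps."""
    n = len(estimates)
    solved = np.empty(n)
    for start in range(0, n, interval):
        solved[start:start + interval] = estimates[start:start + interval].mean()
    return solved

def rmse(a, b):
    return np.sqrt(np.mean((a - b) ** 2))

# A very short interval fits noise; a very long one smears the drift;
# an intermediate interval balances the two.
for interval in (1, 16, 256):
    err = rmse(solve_with_interval(noisy, interval), true_gain)
    print(f"interval={interval:4d}  RMSE vs true gain: {err:.3f}")
```

Running this shows the intermediate interval yielding the lowest error against the true gain, which mirrors the balance between capturing intrinsic variability and not fitting noise that the automatic interval-selection algorithm aims for.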
