ABSTRACT

The turbulent ionosphere causes phase shifts to incoming radio waves on a broad range of temporal and spatial scales. When an interferometer is not sufficiently calibrated for the direction-dependent ionospheric effects, the time-varying phase shifts can cause the signal to decorrelate. The ionosphere's influence over various spatiotemporal scales introduces a baseline-dependent effect on the interferometric array. We study the impact of baseline-dependent decorrelation on high-redshift observations with the Low Frequency Array (LOFAR). Data sets with a range of ionospheric corruptions are simulated using a thin-screen ionosphere model and calibrated using the state-of-the-art LOFAR epoch of reionization pipeline. For the first time, we show the ionospheric impact on various stages of the calibration process, including an analysis of the transfer of gain errors from longer to shorter baselines using realistic end-to-end simulations. We find that direction-dependent calibration for source subtraction leaves excess power of up to two orders of magnitude above the thermal noise at the largest spectral scales in the cylindrically averaged autopower spectrum under normal ionospheric conditions. However, we demonstrate that this excess power can be removed through Gaussian process regression, leaving no excess power above the 10 per cent level for a 5 km diffractive scale. We conclude that ionospheric errors, in the absence of interactions with other aggravating effects, do not constitute a dominant component in the excess power observed in LOFAR epoch of reionization observations of the North Celestial Pole. Future work should therefore focus on less spectrally smooth effects, such as beam modelling errors.
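The abstract's removal of spectrally smooth excess power via Gaussian process regression can be illustrated with a minimal sketch. This is not the LOFAR pipeline itself; it is a toy example using scikit-learn, with hypothetical frequencies, amplitudes, and kernel length-scales chosen only for illustration. A kernel with a long frequency-coherence length models the smooth component, which is then fitted and subtracted:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Hypothetical toy data: a quantity measured against frequency containing
# a spectrally smooth component plus a faint noise-like residual.
freq = np.linspace(115e6, 135e6, 64)[:, None]   # Hz (illustrative band)
smooth = 5.0 * np.sin(freq[:, 0] / 4e6)         # smooth structure to remove
faint = 0.05 * rng.standard_normal(64)          # small-scale residual
data = smooth + faint

# RBF kernel with a long length-scale captures the smooth component;
# WhiteKernel absorbs the noise-like part so it is not fitted out.
kernel = 1.0**2 * RBF(length_scale=5e6) + WhiteKernel(noise_level=0.05**2)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gpr.fit(freq, data)

# Subtracting the GP posterior mean removes the smooth excess power.
residual = data - gpr.predict(freq)
```

In the paper's context the GP model is fitted along the frequency axis of the calibrated data so that smooth (large spectral scale) excess power is absorbed by the GP component, while structure on finer spectral scales is left untouched.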