Abstract
We make use of a large set of fast simulations of an intensity mapping experiment with characteristics similar to those expected of the Square Kilometre Array (SKA) in order to study the viability and limits of blind foreground subtraction techniques. In particular, we consider three different approaches: polynomial fitting, principal component analysis (PCA) and independent component analysis (ICA). We review the motivations and algorithms for the three methods, and show that they can all be described, using the same mathematical framework, as different approaches to the blind source separation problem. We study the efficiency of foreground subtraction both in the angular and radial (frequency) directions, as well as the dependence of this efficiency on different instrumental and modelling parameters. For well-behaved foregrounds and instrumental effects we find that foreground subtraction can be successful to a reasonable level on most scales of interest. We also quantify the effect that the cleaning has on the recovered signal and power spectra. Interestingly, we find that the three methods yield quantitatively similar results, with PCA and ICA being almost equivalent.
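As an illustration of the simplest of the three approaches, the sketch below applies polynomial fitting along the frequency direction to a single toy line of sight. All quantities here (band, power-law index, amplitudes) are invented for illustration and are not taken from the paper's simulations: a smooth power-law foreground is fit with a low-order polynomial in log-log space and subtracted, leaving a residual at the level of the faint fluctuating signal.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: 64 frequency channels along one line of sight.
freqs = np.linspace(400.0, 800.0, 64)            # MHz, illustrative band
foreground = 50.0 * (freqs / 600.0) ** -2.7      # smooth power-law foreground
signal = 1e-3 * rng.standard_normal(freqs.size)  # faint fluctuating signal
data = foreground + signal

# Fit a low-order polynomial to log(T) vs log(freq) and subtract it.
# The smooth foreground is captured by the fit; the signal survives.
logf = np.log(freqs)
coeffs = np.polyfit(logf, np.log(data), deg=3)
cleaned = data - np.exp(np.polyval(coeffs, logf))

# The residual is orders of magnitude below the raw foreground amplitude.
print(np.std(cleaned) < 0.1 * np.std(foreground))  # prints True
```

Because the foreground here is an exact power law, even a deg=1 fit would suffice; in practice the polynomial order trades off foreground removal against accidental removal of the smooth part of the signal.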
Highlights
The last few decades have seen a radical improvement in the quantity and quality of observational data that has transformed cosmology into a fully-fledged data-driven science – so much so that it has become common to refer to the current status of the field as the ‘era of precision cosmology’.
As we have shown in the previous three sections, the three blind methods under study in this paper can be understood as three different ways of approaching the least-squares problem of equations (4) and (6), each following different criteria in order to find the basis functions A.
(iii) Independent component analysis (ICA), like principal component analysis (PCA), maximizes the total variance, but instead of uncorrelatedness it imposes statistical independence. Since these two properties are equivalent for Gaussian variables, PCA and ICA are mathematically equivalent when all the sources are Gaussian.
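The PCA side of this picture can be sketched in a few lines. The toy setup below is an assumption for illustration only (three smooth spectral modes standing in for bright foregrounds, plus a faint Gaussian signal): the frequency-frequency covariance of the data is diagonalized, and the few leading eigenmodes, dominated by the foregrounds, are projected out.

```python
import numpy as np

rng = np.random.default_rng(1)
n_freq, n_pix = 32, 500

# Hypothetical toy sky: foregrounds live in a 3-dimensional subspace of
# smooth, frequency-coherent spectral shapes; the signal is faint noise.
x = np.linspace(0.0, 1.0, n_freq)
fg_modes = np.vstack([x**0, x, x**2])              # smooth spectral shapes
fg_amps = 100.0 * rng.standard_normal((3, n_pix))  # bright amplitudes
foregrounds = fg_modes.T @ fg_amps                 # shape (n_freq, n_pix)
signal = 1e-2 * rng.standard_normal((n_freq, n_pix))
data = foregrounds + signal

# PCA cleaning: diagonalize the frequency-frequency covariance and
# remove the leading eigenmodes, which the foregrounds dominate.
cov = data @ data.T / n_pix
evals, evecs = np.linalg.eigh(cov)                 # eigenvalues ascending
leading = evecs[:, -3:]                            # 3 largest modes
cleaned = data - leading @ (leading.T @ data)

print(np.std(cleaned) < 0.1 * np.std(foregrounds))  # prints True
```

ICA would replace the eigen-decomposition with a search for maximally independent (rather than merely uncorrelated) components; for the Gaussian amplitudes drawn here the two criteria coincide, which is exactly the equivalence noted in highlight (iii).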
Summary
The last few decades have seen a radical improvement in the quantity and quality of observational data that has transformed cosmology into a fully-fledged data-driven science – so much so that it has become common to refer to the current status of the field as the ‘era of precision cosmology’. Astronomical observations in the optical range of wavelengths have supplied a wealth of data with which we have been able to characterize the late-time (z ≲ 1) evolution of the Universe and to make one of the most puzzling discoveries in cosmology: its accelerated expansion (Riess et al. 1998; Perlmutter et al. 1999). In this regime, galaxy redshift surveys (Contreras et al. 2013; Anderson et al. 2014) have allowed us to draw a plausible picture describing how the primordial density perturbations observed in the cosmic microwave background (CMB) evolved to form the non-linear large-scale structure (LSS) that we observe today.