Abstract

With the advent of gravitational-wave astronomy, techniques to extend the reach of gravitational-wave detectors are desired. In addition to the stellar-mass black hole and neutron star mergers already detected, many more lie below the surface of the noise, available for detection if the noise is reduced enough. Our method (DeepClean) applies machine-learning algorithms to gravitational-wave detector data and to data from on-site sensors monitoring the instrument, in order to reduce the noise in the time series due to instrumental artifacts and environmental contamination. The framework is generic enough to subtract linear, non-linear, and non-stationary coupling mechanisms. It may also provide insight into coupling mechanisms that are not currently understood to be limiting detector sensitivities. The robustness of the noise-reduction technique, i.e., its ability to remove noise efficiently with no unintended effect on gravitational-wave signals, is assessed through software signal injection and parameter estimation of the recovered signal. We show that the optimal signal-to-noise ratio (SNR) of the injected signal is enhanced by $\sim 21.6\%$ and that the recovered parameters are consistent with the injected set. We present the performance of this algorithm on linear and non-linear noise sources and discuss its impact on astrophysical searches by gravitational-wave detectors.
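
As a rough illustration of the approach described in the abstract, the sketch below regresses a noise estimate out of the strain using witness-sensor channels: a small 1D convolutional network maps the witnesses to a predicted noise contribution, which is then subtracted. The channel count, window length, architecture, and mean-squared-residual loss are illustrative assumptions, not the DeepClean configuration described in the paper.

```python
# Minimal sketch of witness-based noise regression (illustrative only;
# not the DeepClean architecture, loss, or training setup from the paper).
import torch
import torch.nn as nn

N_WITNESS = 8   # assumed number of witness (sensor) channels
WINDOW = 4096   # assumed samples per training window

class NoiseRegressor(nn.Module):
    """Map witness time series to a predicted noise contribution in the strain."""
    def __init__(self, n_witness: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(n_witness, 32, kernel_size=33, padding=16),
            nn.Tanh(),
            nn.Conv1d(32, 32, kernel_size=33, padding=16),
            nn.Tanh(),
            nn.Conv1d(32, 1, kernel_size=33, padding=16),
        )

    def forward(self, witnesses: torch.Tensor) -> torch.Tensor:
        # witnesses: (batch, n_witness, n_samples) -> (batch, n_samples)
        return self.net(witnesses).squeeze(1)

model = NoiseRegressor(N_WITNESS)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# Toy data standing in for whitened strain and witness channels.
strain = torch.randn(16, WINDOW)
witnesses = torch.randn(16, N_WITNESS, WINDOW)

for epoch in range(10):
    optimizer.zero_grad()
    noise_estimate = model(witnesses)
    residual = strain - noise_estimate
    # Minimizing the mean-squared residual is a simple stand-in for the
    # loss described in the paper; it drives the network to predict only
    # the part of the strain that is coherent with the witnesses.
    loss = residual.pow(2).mean()
    loss.backward()
    optimizer.step()

# Cleaned strain = original strain minus the learned noise estimate.
cleaned = strain - model(witnesses).detach()
```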

Highlights

  • The recent detections of gravitational waves from binary systems motivate technological and data-analysis improvements to extend the reach of current gravitational-wave detectors

  • The current network consists of the two Advanced Laser Interferometer Gravitational-Wave Observatory (LIGO) interferometers in the United States [2], the Advanced Virgo interferometer in Italy [3], the GEO-HF interferometer in Germany [4], the Kamioka Gravitational-Wave Detector (KAGRA) interferometer in Japan [5], and eventually the LIGO-India detector in India [6]

  • The range obtained from DeepClean agrees to within 1–2% with the Wiener-filter result, suggesting that the network has learned the coefficients of the optimal mean-squared-error (MSE) filter and captured the physical couplings without overfitting or adding any additional noise (see the Wiener-filter sketch after this list)
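
For context on the Wiener-filter baseline mentioned in the last highlight, the sketch below computes a single-witness FIR Wiener filter by solving the Wiener-Hopf (normal) equations and subtracts the filtered witness from the target. The single-channel setup, filter length, and toy signals are assumptions made for illustration; the paper's comparison uses a multi-channel filter on real detector data.

```python
# Single-witness FIR Wiener filter sketch (illustrative; the paper's
# baseline is a multi-channel filter applied to real detector data).
import numpy as np
from scipy.linalg import solve_toeplitz
from scipy.signal import lfilter

def wiener_fir(target: np.ndarray, witness: np.ndarray, n_taps: int) -> np.ndarray:
    """Return FIR taps h minimizing the mean-squared error E[(target - h * witness)^2]."""
    n = len(witness)
    # Witness autocorrelation (first column of the Toeplitz matrix) and
    # witness-target cross-correlation (right-hand side).
    acf = np.array([np.dot(witness[: n - k], witness[k:]) for k in range(n_taps)])
    xcf = np.array([np.dot(witness[: n - k], target[k:]) for k in range(n_taps)])
    # Solve the Wiener-Hopf equations R h = p for the filter taps.
    return solve_toeplitz(acf, xcf)

rng = np.random.default_rng(0)
n = 16384
witness = rng.standard_normal(n)          # toy witness channel
signal = 0.1 * rng.standard_normal(n)     # toy "astrophysical" content
# Target = signal plus noise coupled in as a filtered copy of the witness.
coupling = np.array([0.5, -0.3, 0.2])
target = signal + lfilter(coupling, 1.0, witness)

h = wiener_fir(target, witness, n_taps=32)
cleaned = target - lfilter(h, 1.0, witness)

print("residual power before:", np.var(target - signal))
print("residual power after: ", np.var(cleaned - signal))
```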

Summary

INTRODUCTION

The recent detections of gravitational waves from binary systems (see Ref. [1] for a summary of the first two observing runs) motivate technological and data-analysis improvements to extend the reach of current gravitational-wave detectors. Once the known noise sources are categorized into causally distinct groups, we can predict the instrument’s performance from the incoherent sum of these noise mechanisms and compare it to the observed steady-state sensitivity. This noise budget is a crucial analysis when working to understand and improve performance, as it diagnoses which aspects or subsystems of the detector are the limiting factors. It shows us where the observed noise exceeds the sum of the budgeted noise sources, and thereby where our understanding of the noise is incomplete. The recorded signal from GW150914 only spent about 200 ms in the sensitive band of the instrument and was resolvable from about 35–250 Hz [9].
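
As a concrete illustration of the noise-budget comparison described above, the sketch below forms the incoherent (quadrature) sum of individual noise amplitude spectral densities and flags frequency bins where the measured spectrum exceeds the budget. The noise terms and spectra here are placeholder curves, not actual LIGO budget data.

```python
# Toy noise-budget comparison: incoherent sum of budgeted noise terms
# versus a "measured" spectrum (placeholder curves, not real LIGO data).
import numpy as np

freqs = np.logspace(1, 3, 500)  # 10 Hz to 1 kHz

# Placeholder amplitude spectral densities (strain / sqrt(Hz)) for a few
# budgeted noise mechanisms, each with a different frequency dependence.
budget = {
    "seismic": 1e-20 * (10.0 / freqs) ** 4,
    "thermal": 3e-24 * (100.0 / freqs) ** 0.5,
    "quantum": 2e-24 * np.sqrt(1.0 + (freqs / 200.0) ** 2),
}

# Incoherent sum: uncorrelated noises add in power, so ASDs add in quadrature.
budget_total = np.sqrt(sum(asd ** 2 for asd in budget.values()))

# A toy "measured" spectrum with some unbudgeted excess between 40 and 80 Hz.
measured = budget_total * (1.0 + 0.5 * ((freqs > 40) & (freqs < 80)))

# Frequencies where the observed noise exceeds the budget by more than 10%
# point to coupling mechanisms that are not yet understood or modeled.
excess = freqs[measured > 1.1 * budget_total]
print("excess noise band: %.0f-%.0f Hz" % (excess.min(), excess.max()))
```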

BACKGROUND
NOISE SUBTRACTION PIPELINE
Formalism and loss function
Data preprocessing
Neural network architecture
Training and inference
Output data postprocessing
PIPELINE PERFORMANCE ON LIGO DATA
O2 jitter noise
O3 60-Hz sidebands
PARAMETER ESTIMATION AND NETWORK SAFETY
Findings
CONCLUSIONS