Abstract

As two neutron stars merge, they emit gravitational waves that can potentially be detected by Earth-bound detectors. Matched-filtering-based algorithms have traditionally been used to extract quiet signals embedded in noise. We introduce a novel neural-network-based machine learning algorithm that uses time series strain data from gravitational-wave detectors to detect signals from nonspinning binary neutron star mergers. For the Advanced LIGO design sensitivity, our network has an average sensitive distance of 130 Mpc at a false-alarm rate of ten per month. Compared to other state-of-the-art machine learning algorithms, we find an improvement by a factor of 4 in sensitivity to signals with a signal-to-noise ratio between 8 and 15. However, this approach is not yet competitive with traditional matched-filtering-based methods. A conservative estimate indicates that our algorithm introduces on average 10.2 s of latency between signal arrival and generating an alert. We give an exact description of our testing procedure, which can be applied not only to machine-learning-based algorithms but to all other search algorithms as well. We thereby improve the ability to compare machine learning and classical searches.
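
The sensitive distance quoted above is typically obtained from an injection campaign: simulated signals are added to detector noise at known distances, and the fraction recovered at a fixed false-alarm rate yields a sensitive volume, whose equivalent radius is the sensitive distance. The snippet below is a minimal sketch of that calculation for injections distributed uniformly in volume out to a maximal distance; the function name, the uniform-in-volume assumption, and the toy numbers are illustrative choices and are not taken from the paper.

```python
import numpy as np

def sensitive_distance(found, max_distance_mpc):
    """Estimate the sensitive distance from an injection campaign.

    found : boolean array, True where the search recovered the injection
            at the chosen false-alarm-rate threshold.
    max_distance_mpc : maximal injection distance; injections are assumed
            to be distributed uniformly in volume out to this distance.
    """
    found = np.asarray(found, dtype=bool)
    # Volume of the sphere containing all injections.
    total_volume = 4.0 / 3.0 * np.pi * max_distance_mpc**3
    # Monte Carlo estimate of the sensitive volume: fraction of recovered
    # injections times the total injection volume.
    sensitive_volume = total_volume * found.mean()
    # Radius of a sphere with that sensitive volume.
    return (3.0 * sensitive_volume / (4.0 * np.pi)) ** (1.0 / 3.0)

# Toy example: 10,000 injections out to 400 Mpc, ~3.5% recovered.
rng = np.random.default_rng(0)
found = rng.random(10_000) < 0.035
print(f"sensitive distance ≈ {sensitive_distance(found, 400.0):.0f} Mpc")
```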

Highlights

  • The first direct detection of a gravitational-wave (GW) signal on September 14, 2015 [1] marked the dawn of gravitational-wave astronomy

  • We find that the signal-to-noise ratio (SNR) estimate is able to resolve false-alarm rates (FARs) down to 0.6 per month, whereas the p-score output is able to resolve FARs only down to 12 per month (a sketch after this list illustrates why such resolution floors arise)

  • Our previous work [58] was able to resolve FARs down to ≈ 30 per month and was tested on a set of roughly half the duration used in this paper
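
The FAR resolution quoted in the highlights is limited by the background data used for testing: a candidate ranked above every background event can only be assigned a FAR bound of roughly one over the background duration, and a ranking statistic whose output saturates, leaving many background events tied at its maximum value, cannot be thresholded below the rate of those tied events. The sketch below illustrates both effects with a simple background-counting estimate; the function names, the saturation explanation, and all numbers are illustrative assumptions rather than values from the paper.

```python
import numpy as np

def far_per_month(background_stats, background_months, threshold):
    """FAR assigned to a candidate with ranking statistic >= threshold,
    estimated by counting at-least-as-loud background events."""
    background_stats = np.asarray(background_stats)
    return np.count_nonzero(background_stats >= threshold) / background_months

def minimum_resolvable_far(background_stats, background_months):
    """Smallest FAR this statistic can resolve on this background: the rate
    of background events tied at the statistic's maximum (at best a single
    event, i.e. one over the background duration)."""
    background_stats = np.asarray(background_stats)
    n_tied = np.count_nonzero(background_stats == background_stats.max())
    return n_tied / background_months

rng = np.random.default_rng(1)
months = 12.0  # toy background duration

# A continuous, unbounded output (e.g. an SNR-like estimate) essentially
# never produces ties, so its floor is ~1 / duration.
snr_like = rng.normal(6.0, 1.0, size=50_000)
print(minimum_resolvable_far(snr_like, months))      # ~0.08 per month
print(far_per_month(snr_like, months, threshold=8.0))  # FAR at threshold 8

# A bounded output clipped at 1.0 (e.g. a probability-like score) can leave
# many background events tied at the maximum, raising the resolvable floor.
p_like = np.clip(rng.normal(0.9, 0.1, size=50_000), 0.0, 1.0)
print(minimum_resolvable_far(p_like, months))        # hundreds per month
```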


Summary

INTRODUCTION

The first direct detection of a gravitational-wave (GW) signal on September 14, 2015 [1] marked the dawn of gravitational-wave astronomy. References [43,44] were the first to directly apply deep neural networks (NNs) to time series strain data to detect GWs from binary black hole (BBH) mergers. They tested the sensitivity of these searches at estimated FARs of O(10³) per month. Both networks detected all signals with an SNR larger than 10 at estimated FARs of O(10⁴) per month. These results are a promising first step, but the algorithms would need to be tested at the required FARs of one per 2 months on real detector data to demonstrate an improvement over established methods. Starting from their network, we reshaped the architecture significantly to optimize it for detecting signals from binary neutron star (BNS) mergers.
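
As a rough illustration of what applying a deep NN directly to time series strain data looks like in practice, the snippet below defines a small one-dimensional convolutional classifier that maps a fixed-length, whitened two-detector strain segment to a signal-versus-noise score. The architecture, segment length, and sample rate are generic assumptions for illustration only and do not reproduce the network described in this paper or in [43,44].

```python
import torch
import torch.nn as nn

class StrainClassifier(nn.Module):
    """Toy 1D CNN: two-detector strain segment -> signal/noise logits."""

    def __init__(self, n_detectors: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_detectors, 16, kernel_size=16), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(16, 32, kernel_size=8), nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=8), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.classify = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64, 2),  # logits for (noise, signal)
        )

    def forward(self, strain: torch.Tensor) -> torch.Tensor:
        # strain: (batch, n_detectors, n_samples), whitened time series
        return self.classify(self.features(strain))

# Example: a batch of 8 one-second segments sampled at 2048 Hz.
model = StrainClassifier()
scores = model(torch.randn(8, 2, 2048))
print(scores.shape)  # torch.Size([8, 2])
```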

FALSE-ALARM RATE AND SENSITIVITY OF GRAVITATIONAL-WAVE SEARCH ALGORITHMS
Calculation for general search algorithms
Calculation for neural network searches
DATA PROCESSING
Input data preparation
Generating training and validation set
Neural network architecture
Training
Testing on binary neutron star injections
RESULTS
Comparison to PyCBC Live
Comparison to another machine learning algorithm
Binary black hole injections
CONCLUSIONS
