Abstract

We present a follow-up method based on supervised machine learning (ML) to improve the performance of searches for gravitational-wave (GW) bursts from core-collapse supernovae (CCSNe) with the coherent WaveBurst (cWB) pipeline. The ML model discriminates noise from signal events using as features a set of reconstruction parameters provided by cWB. Events classified as noise are discarded, which reduces the false alarm rate (FAR) and the false alarm probability (FAP) and thus enhances the statistical significance of the surviving events. We tested the proposed method using strain data from the first half of the third observing run of Advanced LIGO, and CCSN GW signals extracted from 3D simulations. The ML model is trained on a dataset of noise and signal events and then used to identify and discard noise events in cWB analyses. Noise and signal reduction levels were examined for single-detector networks (L1 and H1) and the two-detector network (L1H1). The FAR was reduced by a factor of $\sim10$ to $\sim100$, yielding an enhancement in statistical significance of $\sim1$ to $\sim2\sigma$, with no impact on detection efficiency.
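The workflow described above can be sketched as follows. This is an illustrative toy, not the authors' implementation: the feature names (network correlation `cc`, coherent SNR `rho`, central frequency `f0`) are assumptions standing in for cWB reconstruction parameters, the data are synthetic stand-ins for labelled triggers, and a random forest is one possible choice of supervised classifier.

```python
# Hedged sketch: train a supervised classifier to separate noise from signal
# triggers using cWB-style reconstruction parameters as features, then veto
# events classified as noise (lowering the FAR). All feature names and data
# distributions here are illustrative assumptions, not the paper's setup.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000

# Synthetic "noise" triggers: lower network correlation and coherent SNR.
noise = np.column_stack([
    rng.uniform(0.0, 0.6, n),    # cc  (network correlation, assumed feature)
    rng.uniform(4.0, 8.0, n),    # rho (coherent SNR, assumed feature)
    rng.uniform(30, 2000, n),    # f0  (central frequency in Hz, assumed)
])
# Synthetic "signal" triggers: higher correlation and SNR on average.
signal = np.column_stack([
    rng.uniform(0.5, 1.0, n),
    rng.uniform(6.0, 20.0, n),
    rng.uniform(30, 2000, n),
])

X = np.vstack([noise, signal])
y = np.concatenate([np.zeros(n), np.ones(n)])  # 0 = noise, 1 = signal

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)
clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Triggers predicted as noise would be discarded from the cWB event list.
keep = clf.predict(X_te) == 1
print(f"accuracy: {clf.score(X_te, y_te):.2f}, events kept: {keep.sum()}/{len(keep)}")
```

In a real analysis the labels would come from background triggers (time-slided noise) and simulated CCSN injections, and the classifier's veto threshold would be tuned so that the gain in FAR does not cost detection efficiency.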
