Abstract

We present a time-frequency method to detect gravitational wave signals in interferometric data. This robust method can detect signals from poorly modeled and unmodeled sources. We evaluate the method on simulated data containing noise and signal components. The noise component approximates initial Laser Interferometer Gravitational-wave Observatory (LIGO) interferometer noise. The signal components have the time and frequency characteristics postulated by Flanagan and Hughes for binary black hole coalescence. The signals correspond to binaries with total masses between $45M_{\odot}$ and $70M_{\odot}$ and with (optimal filter) signal-to-noise ratios of 7 to 12. The method is implementable in real time, and achieves a coincident false alarm rate for two detectors of $\approx 1$ per 475 years. At this false alarm rate, the single detector false dismissal rate for our signal model is as low as 5.3% at a signal-to-noise ratio of 10. We expect to obtain similar or better detection rates with this method for any signal of similar power that satisfies certain adiabaticity criteria. Because optimal filtering requires knowledge of the signal waveform to high precision, we argue that this method is likely to detect signals that are undetectable by optimal filtering, which is at present the best developed detection method for transient sources of gravitational waves.
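To illustrate the general idea of time-frequency detection described above, the sketch below sums spectrogram power inside a chosen frequency band and flags segments whose summed power is anomalously large. This is a generic excess-power statistic under assumed choices (Hann window, non-overlapping segments, a fixed band), not the authors' specific pipeline or tiling scheme.

```python
import numpy as np

def excess_power_statistic(strain, fs, seg_len, f_lo, f_hi):
    """Generic time-frequency excess-power statistic: for each
    non-overlapping segment, sum the windowed-FFT power in the
    band [f_lo, f_hi]. Window and normalization are illustrative
    assumptions, not the paper's exact choices."""
    n_segs = len(strain) // seg_len
    freqs = np.fft.rfftfreq(seg_len, d=1.0 / fs)
    band = (freqs >= f_lo) & (freqs <= f_hi)
    window = np.hanning(seg_len)
    stats = []
    for i in range(n_segs):
        seg = strain[i * seg_len:(i + 1) * seg_len]
        power = np.abs(np.fft.rfft(seg * window)) ** 2
        stats.append(power[band].sum())
    return np.array(stats)

# Usage: white noise with a loud 100 Hz burst injected into segment 2.
rng = np.random.default_rng(0)
fs, seg_len = 1024, 256
data = rng.standard_normal(fs * 4)
t = np.arange(seg_len) / fs
data[2 * seg_len:3 * seg_len] += 5.0 * np.sin(2 * np.pi * 100.0 * t)
stats = excess_power_statistic(data, fs, seg_len, 80.0, 120.0)
print(int(np.argmax(stats)))  # -> 2, the segment containing the burst
```

In a real search the detection threshold on this statistic would be calibrated from noise-only data to fix the false alarm rate, and coincidence between detectors would further suppress false alarms, as the abstract describes.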
