Abstract

The upcoming direct detection of gravitational waves will open a window onto the strong-field regime of general relativity (GR). As a consequence, waveforms that incorporate possible deviations from GR have been developed (e.g. in the parametrized post-Einsteinian approach). TIGER, a data analysis pipeline that builds Bayesian evidence to support or question the validity of GR, has been written and tested. In particular, it was shown recently that data from the LIGO and Virgo detectors will allow the detection of deviations from GR that are smaller than those probed by Solar System tests and pulsar timing measurements, or that are inaccessible to conventional tests of GR. However, evidence from several detections is required before a deviation from GR can be confidently claimed. An interesting consequence is that, should GR not be the correct theory of gravity in its strong-field regime, using standard GR templates for the matched-filter analysis of interferometer data will introduce biases in the measured gravitational-wave parameters, with potentially disastrous consequences for astrophysical inferences such as the coalescence rate or the mass distribution. We consider three heuristic possible deviations from GR and show that the biases introduced by assuming GR's validity manifest themselves in various ways. The mass parameters are usually the most affected, with biases that can be as large as $30$ standard deviations for the symmetric mass ratio, and nearly one percent for the chirp mass, which is usually estimated with sub-percent accuracy. We conclude that statements about the nature of the observed sources, e.g. whether both objects are neutron stars, depend critically on the explicit assumption that GR is the right theory of gravity in the strong-field regime.
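
For readers unfamiliar with the mass parameters quoted above, the chirp mass $\mathcal{M}$ and the symmetric mass ratio $\eta$ are conventionally defined in terms of the component masses $m_1$ and $m_2$ of the binary (these are the standard definitions, not taken from the abstract itself):
\[
  \mathcal{M} = \frac{(m_1 m_2)^{3/5}}{(m_1 + m_2)^{1/5}} , \qquad
  \eta = \frac{m_1 m_2}{(m_1 + m_2)^{2}} ,
\]
where $\eta$ ranges between $0$ and its maximum value of $1/4$, attained for equal-mass systems.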
