Abstract

Our ability to characterize astrophysical gravitational waves depends on our understanding of the detectors used to observe them. Specifically, our ability to calibrate current kilometer-scale interferometers can potentially confound the inference of astrophysical signals. Current calibration uncertainties are dominated by systematic errors between the modeled and observed detector responses and are well described by a Gaussian process. I exploit this description to analytically examine the impact of calibration uncertainty. I derive closed-form expressions for the conditioned likelihood of the calibration error given the observed data and an astrophysical signal (astrophysical calibration) as well as for the marginal likelihood for the data given a signal (integrated over the calibration uncertainty). I show that calibration uncertainty always reduces search sensitivity and the amount of information available about astrophysical signals. Additionally, calibration uncertainty will fundamentally limit the precision to which loud signals can be constrained, a crucial factor when considering the scientific potential of proposed third-generation interferometers. For example, I estimate that with 1% uncertainty in the detector response's amplitude and phase, one will only be able to measure the leading-order tidal parameter ($\tilde{\Lambda}$) for a $1.4+1.4\,M_\odot$ system to better than $\pm 1$ ($\sim 0.2\%$ relative uncertainty) for signals with signal-to-noise ratios $\gtrsim 10^4$. At this signal-to-noise ratio, calibration uncertainty increases $\sigma_{\tilde{\Lambda}}$ by a factor of 2 compared to stationary Gaussian noise alone.
Furthermore, 1% calibration uncertainty limits the precision to always be $\sigma_{\tilde{\Lambda}} \gtrsim 0.5$. At more modest signal-to-noise ratios ($\lesssim 30$) characteristic of the current set of detected events, the uncertainty in $\tilde{\Lambda}$ will be $\sigma_{\tilde{\Lambda}} \gtrsim 100$ and will be dominated by the Gaussian noise rather than calibration uncertainty. I also show how to best select the frequencies at which calibration should be precisely constrained in order to minimize the information lost about astrophysical parameters. It is not necessary to constrain the calibration errors to be small at all frequencies to perform precise astrophysical inference for individual signals.
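The qualitative behavior described above can be illustrated with a toy error budget, not the paper's actual analysis: assume the statistical uncertainty falls as the inverse of the signal-to-noise ratio $\rho$ while calibration contributes an approximately $\rho$-independent floor, and combine the two in quadrature. The constants `c_noise` and `sigma_cal_floor` below are hypothetical, chosen only to mimic the regimes quoted in the abstract.

```python
import math

def sigma_lambda_tilde(rho, c_noise=1.0e4, sigma_cal_floor=0.5):
    """Toy error budget for the tidal parameter Lambda-tilde.

    rho             : signal-to-noise ratio
    c_noise         : hypothetical constant setting the 1/rho statistical term
    sigma_cal_floor : assumed rho-independent floor from calibration uncertainty
    """
    sigma_noise = c_noise / rho  # stationary-Gaussian-noise contribution
    # Quadrature sum of the two (assumed independent) error sources
    return math.hypot(sigma_noise, sigma_cal_floor)

# Low rho: Gaussian noise dominates; high rho: the calibration floor takes over.
for rho in (30, 1e3, 1e4, 1e6):
    print(f"rho = {rho:>9.0f}  sigma_Lambda ~ {sigma_lambda_tilde(rho):.2f}")
```

In this sketch, $\sigma_{\tilde{\Lambda}}$ is noise-dominated at $\rho \sim 30$ and asymptotes to the calibration floor as $\rho \to \infty$, reproducing the saturation effect the abstract describes.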
