Abstract

Most searches for continuous gravitational waves from pulsars use Taylor expansions in the phase to model the spin-down of neutron stars. Studies of pulsars demonstrate that their electromagnetic (EM) emission suffers from \emph{timing noise}: small deviations of the phase from Taylor-expansion models. How the mechanism producing the EM emission is related to any continuous gravitational-wave (CW) emission is unknown; if the two either interact or are locked in phase, then the CW will also experience timing noise. Any disparity between the signal and the search template used in matched-filtering methods results in a loss of signal-to-noise ratio (SNR), referred to as the `mismatch'. In this work we assume the CW suffers a similar level of timing noise to its EM counterpart. We inject and recover fake CW signals that include timing noise generated from observational data on the Crab pulsar. Measuring the mismatch over durations of $\sim 10$ months, we find the effect to be small in most cases. This suggests that recent so-called `narrow-band' searches, which placed upper limits on the signals from the Crab and Vela pulsars, will not be significantly affected. At a fixed observation time, we find that the mismatch depends on the observation epoch. Considering the averaged mismatch as a function of observation time, we find that it increases as a power law with time, and so may become relevant in long-baseline searches.
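To make the notion of mismatch concrete, the following Python sketch (not taken from the paper) estimates the fractional loss of squared SNR when a signal whose phase contains a toy random-walk timing-noise component is recovered with a pure Taylor-expansion template. The phase-only mismatch approximation, the function names, and the roughly Crab-like parameter values are illustrative assumptions; they are not the detection statistic or the observational Crab timing-noise data used in this work.

    import numpy as np

    # Taylor-expansion phase model used for CW templates:
    # phi(t) = phi0 + 2*pi*(f0*t + f1*t^2/2 + f2*t^3/6)
    def taylor_phase(t, f0, f1, f2, phi0=0.0):
        return phi0 + 2.0 * np.pi * (f0 * t + 0.5 * f1 * t**2 + f2 * t**3 / 6.0)

    # Phase-only mismatch: fractional loss of squared SNR when the template
    # phase differs from the signal phase by dphi(t).
    def mismatch(phase_signal, phase_template):
        dphi = phase_signal - phase_template
        return 1.0 - np.abs(np.mean(np.exp(1j * dphi)))**2

    # Roughly Crab-like GW parameters (illustrative values only).
    f0, f1, f2 = 59.2, -7.4e-10, 2.7e-20   # Hz, Hz/s, Hz/s^2
    T_obs = 10 * 30 * 86400.0              # ~10 months, in seconds
    t = np.linspace(0.0, T_obs, 100_000)

    # Toy timing noise: a random-walk phase wander (NOT the observational
    # Crab timing-noise data injected in the paper).
    rng = np.random.default_rng(0)
    timing_noise = np.cumsum(rng.normal(scale=1e-4, size=t.size))

    phi_signal = taylor_phase(t, f0, f1, f2) + timing_noise
    phi_template = taylor_phase(t, f0, f1, f2)
    print(f"mismatch = {mismatch(phi_signal, phi_template):.3e}")

For this gentle toy phase wander the printed mismatch is small; increasing the random-walk step size or the observation time drives it up, qualitatively mirroring the growth of the averaged mismatch with observation time described above.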
