Abstract

Inferring the properties of black holes and neutron stars is a key science goal of gravitational-wave (GW) astronomy. To extract as much information as possible from GW observations, we must develop methods to reduce the cost of Bayesian inference. In this paper, we use artificial neural networks (ANNs) and the parallelisation power of graphics processing units (GPUs) to improve the surrogate modelling method, which can produce accelerated versions of existing models. As a first application of our method, ANN-Sur, we build a time-domain surrogate model of the spin-aligned binary black hole (BBH) waveform model SEOBNRv4. We achieve median mismatches of $2\times10^{-5}$ and mismatches no worse than $2\times10^{-3}$. For a typical BBH waveform generated from 12 Hz with a total mass of $60 M_\odot$, the original SEOBNRv4 model takes 1812 ms. Existing bespoke code optimisations (SEOBNRv4opt) reduced this to 91.6 ms, and the interpolation-based, frequency-domain surrogate SEOBNRv4ROM can generate this waveform in 6.9 ms. Our ANN-Sur model takes 2.7 ms when run on a CPU and just 0.4 ms when run on a GPU. ANN-Sur can also generate large batches of waveforms simultaneously. We find that batches of up to $10^4$ waveforms can be evaluated on a GPU in just 163 ms, corresponding to a time per waveform of 0.016 ms. This method is a promising way to utilise the parallelisation power of GPUs to drastically increase the computational efficiency of Bayesian parameter estimation.
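The mismatch figures quoted above measure how well the surrogate reproduces the original model: the mismatch is one minus the noise-weighted overlap between two waveforms, maximised over relative time and phase shifts. As a minimal sketch (not the paper's code), the following assumes a flat (white) noise spectrum, so the overlap reduces to a normalised circular cross-correlation computed via FFT; a real analysis would weight by a detector power spectral density.

```python
import numpy as np

def mismatch(h1, h2):
    """Mismatch 1 - max overlap between two equal-length waveforms,
    assuming a flat noise PSD. The FFT-based circular cross-correlation
    maximises the overlap over a relative time shift; taking its
    magnitude absorbs an overall sign/phase."""
    # Cross-correlation of h1 against all circular time shifts of h2.
    corr = np.fft.ifft(np.fft.fft(h1) * np.conj(np.fft.fft(h2)))
    # Normalise by the product of the waveform norms.
    norm = np.sqrt(np.sum(np.abs(h1) ** 2) * np.sum(np.abs(h2) ** 2))
    return 1.0 - np.max(np.abs(corr)) / norm

# Illustrative check: a time-shifted copy of a signal has ~zero mismatch.
t = np.linspace(0.0, 1.0, 1024, endpoint=False)
h = np.sin(2 * np.pi * 10 * t)
print(mismatch(h, np.roll(h, 37)))  # ~0
```

A mismatch of $2\times10^{-5}$ therefore means the surrogate and SEOBNRv4 waveforms are indistinguishable at the 99.998% overlap level after optimising the alignment.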
