Abstract

We present a maximum-likelihood (ML) algorithm that is fast enough to detect γ-ray transients in real time on the low-performance processors often used for space applications. We validate the routine with simulations and find that, relative to algorithms based on excess counts, the ML method is nearly twice as sensitive, allowing detection of 240%–280% more short γ-ray bursts. We characterize a reference implementation of the code, estimating its computational complexity and benchmarking it on a range of processors. We exercise the reference implementation on archival data from the Fermi Gamma-ray Burst Monitor (GBM), verifying the sensitivity improvements. In particular, we show that the ML algorithm would have detected GRB 170817A even if it had been nearly 4 times fainter. We present an ad hoc but effective scheme for discriminating against transients caused by background variations. We show that the onboard localizations generated by ML are accurate, but that refined off-line localizations require a detector response matrix with about 10 times finer resolution than current practice. Increasing the resolution of the GBM response matrix could substantially reduce the few-degree systematic uncertainty observed in the localizations of bright bursts.
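For context, the following minimal sketch contrasts the two kinds of detection statistic the abstract refers to, in the simplest possible setting: a single detector and energy bin with a known expected background b and observed counts n. It is an illustrative assumption, not the paper's implementation, which operates jointly over many detectors and energy channels; all names and the single-bin setup are hypothetical.

    # Illustrative single-bin sketch (assumed, not the paper's algorithm):
    # compare a classic excess-counts significance with a Poisson
    # log-likelihood-ratio statistic for "source + background" vs
    # "background only".
    import math

    def excess_counts_snr(n, b):
        """Excess-counts significance: (n - b) / sqrt(b)."""
        return (n - b) / math.sqrt(b)

    def loglike_ratio(n, b):
        """2 * Poisson log-likelihood ratio, source amplitude profiled out.

        Under the alternative hypothesis the ML estimate of the total
        expected counts is simply n, giving 2*[n*ln(n/b) - (n - b)].
        """
        if n <= b:
            return 0.0  # no preference for a positive source term
        return 2.0 * (n * math.log(n / b) - (n - b))

    if __name__ == "__main__":
        b, n = 100.0, 130.0  # made-up expected background and observed counts
        print(f"excess-counts SNR:        {excess_counts_snr(n, b):.2f} sigma")
        print(f"2 * log-likelihood ratio: {loglike_ratio(n, b):.2f}")

In a realistic onboard trigger the likelihood would be summed over detectors and energy channels, which is where the sensitivity gain over a single excess-counts cut comes from; the sketch above only illustrates the form of the per-bin statistic.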
