Abstract

We examine the effect of Galactic diffractive interstellar scintillation as a means of explaining the reported deficit of fast radio burst (FRB) detections at low Galactic latitude. We model the unknown underlying FRB flux density distribution as a power law with a rate scaling as S_ν^(−5/2+δ), and account for the fact that the FRBs are detected at unknown positions within the telescope beam. We find that the event rate of FRBs located off the Galactic plane may be enhanced by a factor of ∼30–300 per cent relative to objects near the Galactic plane without necessarily affecting the slope of the distribution. For FRBs whose flux densities are subject to relatively weak diffractive scintillation, as is typical for events detected at high Galactic latitudes, we demonstrate that an effect associated with Eddington bias is responsible for the enhancement. The magnitude of the enhancement increases with the steepness of the underlying flux density distribution, so that existing limits on the disparity in event rates between high and low Galactic latitudes suggest that the FRB population has a steep differential flux density distribution, scaling as S_ν^(−3.5) or steeper. Existing estimates of the event rate in the flux density range probed by the High Time Resolution Universe survey overestimate the true rate by a factor of ∼3.
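The enhancement described above can be illustrated with a simple Monte Carlo sketch. Under the standard assumption that strong diffractive scintillation multiplies each source's flux density by an exponentially distributed gain of unit mean, and that the cumulative source counts follow N(>S) ∝ S^(−γ), the detected rate above a fixed threshold is boosted by the factor E[g^γ] = Γ(γ + 1). The function and parameter names below are illustrative, not taken from the paper:

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(42)

def scintillation_boost(gamma_slope, n=5_000_000, s_min=0.05, s_thresh=1.0):
    """Monte Carlo estimate of the detection-rate enhancement when fluxes
    drawn from a cumulative power law N(>S) ~ S**(-gamma_slope) are
    modulated by exponential scintillation gains (mean 1, strong regime)."""
    # Inverse-CDF sampling of the power law: S = s_min * U**(-1/gamma)
    u = rng.random(n)
    s = s_min * u ** (-1.0 / gamma_slope)
    g = rng.exponential(1.0, n)  # diffractive scintillation gain, <g> = 1
    detected_scint = np.count_nonzero(s * g > s_thresh)
    detected_raw = np.count_nonzero(s > s_thresh)
    return detected_scint / detected_raw

# Analytic expectation for exponential gains: E[g**gamma] = Gamma(gamma + 1)
for slope in (1.5, 2.5):
    print(f"gamma = {slope}: MC boost = {scintillation_boost(slope):.2f}, "
          f"analytic = {gamma(slope + 1):.2f}")
```

For a Euclidean slope (γ = 1.5) the boost is Γ(2.5) ≈ 1.33, while a steeper γ = 2.5 gives Γ(3.5) ≈ 3.32, bracketing the ∼30–300 per cent range quoted in the abstract; this is how a steeper flux density distribution amplifies the scintillation-induced latitude disparity.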
