Abstract

The detection of radioactive sources is an important capability that has led to the deployment of sensor networks for detecting and localizing low-level, hazardous radiation sources. Such networks are generally expected to outperform individual detectors by intelligently fusing information from several dispersed sensors. In this paper, we develop a network detection method that uses the slope of a linear regression fit as a test statistic for detecting a point radioactive source within a field of detectors. In our regression model, we compute a least-squares linear fit between the average radiation counts at the detectors and the inverse-squared distances from the known detector locations to an estimated source location. We show that the slope of this regression fit is an estimate of the source intensity and can be thresholded for source detection. We compare the performance of our proposed detection method with that of a fusion-based Sequential Probability Ratio Test (SPRT) method. For the performance analyses, two datasets from the Domestic Nuclear Detection Office's Intelligence Radiation Sensors Systems (IRSS) outdoor tests are used. Each of these tests consists of several runs of a single radioactive source moving in and out of a detector network. We present receiver operating characteristic (ROC) curves and optimal threshold values for each detection method, and determine that our detection method using the linear regression fit has slightly better overall performance than the SPRT method.
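
To make the test statistic concrete, the following is a minimal Python sketch under the assumption that the mean count at each detector follows a background rate plus a point-source term decaying with the inverse-squared distance. The function name, the detector geometry, the simulated data, and the threshold value are all illustrative assumptions and are not taken from the paper or the IRSS datasets.

    # Minimal sketch of a regression-slope test statistic (illustrative only).
    import numpy as np

    def slope_test_statistic(counts, detector_xy, source_xy):
        """Least-squares slope of average counts vs. inverse-squared distance.

        counts      : (n,) average radiation counts at the n detectors
        detector_xy : (n, 2) known detector locations
        source_xy   : (2,) estimated source location
        Returns the fitted slope, an estimate of the source intensity.
        """
        d2 = np.sum((detector_xy - source_xy) ** 2, axis=1)  # squared distances
        x = 1.0 / d2                                          # inverse-squared distances
        # Fit counts = slope * x + intercept; the slope estimates the source
        # intensity and the intercept absorbs the background count rate.
        slope, intercept = np.polyfit(x, counts, deg=1)
        return slope

    # Hypothetical example: a source of intensity A at (5, 5) plus background b.
    rng = np.random.default_rng(0)
    detectors = rng.uniform(0.0, 10.0, size=(8, 2))
    A, b, src = 200.0, 3.0, np.array([5.0, 5.0])
    d2 = np.sum((detectors - src) ** 2, axis=1)
    counts = rng.poisson(b + A / d2)  # Poisson counting noise

    stat = slope_test_statistic(counts, detectors, src)
    threshold = 50.0  # in practice, chosen from a ROC analysis
    print(f"slope statistic = {stat:.1f}; source declared: {stat > threshold}")

In this sketch, declaring a detection reduces to comparing the fitted slope against a threshold, which is the structure that makes the ROC analysis in the abstract possible: sweeping the threshold traces out the trade-off between detection and false-alarm rates.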
