Abstract

Data fusion between 3D vision sensors and radiological sensors can improve data quality and enable novel applications for nuclear safeguards. While radiological sensors alone allow for nuclear threat detection, adding a 3D vision sensor can improve threat detection through its ability to track objects in a scene. Ten measurements were taken in which three to four people walked in a room while one of them carried a Cf-252 source in a backpack. A data-fusion algorithm was used to correlate the radiological and vision data, and the vision trajectory with the highest correlation value was selected as the one carrying the radiological material. Filtering and refining the radiological and vision data were also explored in search of improvements. For unaltered data, the data-fusion approach correctly identified the source-carrying trajectory in all ten measurements, except in cases where counting statistics were low or the signal-to-background ratio was low. Filtering and refining the data improved the correlation values for all trajectories, as expected. The presented algorithm has proven to be an effective means of fusing the two types of sensor data, and these initial results show the effectiveness of incorporating 3D vision in radiological detection systems.
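The abstract does not specify the correlation metric or signal model used. A minimal sketch of the selection step it describes, assuming a hypothetical inverse-square (1/r²) expected-signal model and Pearson correlation (both assumptions, not the paper's stated method), might look like:

```python
import numpy as np

def select_source_trajectory(count_rate, trajectories, detector_pos):
    """Pick the tracked trajectory most correlated with the radiological signal.

    count_rate:   (T,) measured counts per time bin from the radiological sensor
    trajectories: dict mapping trajectory name -> (T, 3) positions over the same bins
    detector_pos: (3,) detector location

    Assumes the expected count rate falls off as 1/r^2 with distance r to the
    detector (hypothetical model; the actual metric is not given in the abstract).
    """
    best_name, best_corr = None, -np.inf
    for name, path in trajectories.items():
        # Distance from each tracked position to the detector
        r = np.linalg.norm(path - detector_pos, axis=1)
        # Expected signal under the assumed inverse-square model
        expected = 1.0 / np.maximum(r, 1e-6) ** 2
        # Pearson correlation between measured and expected signal
        corr = np.corrcoef(count_rate, expected)[0, 1]
        if corr > best_corr:
            best_name, best_corr = name, corr
    return best_name, best_corr
```

With synthetic data where one trajectory approaches the detector while the measured counts follow that trajectory's inverse-square profile, the function returns that trajectory with correlation near 1.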
