Abstract

The enormous advances in sensing and data processing technologies, in combination with recent developments in nuclear radiation detection and imaging, enable unprecedented and “smarter” ways to detect, map, and visualize nuclear radiation. The recently developed concept of three-dimensional (3-D) scene-data fusion now allows us to “see” nuclear radiation in three dimensions, in real time, and specific to radionuclides. It is based on a multi-sensor instrument that is able to map a local scene and to fuse the scene data with nuclear radiation data in 3-D while the instrument moves freely through the scene. This new concept is agnostic of the deployment platform and of the specific radiation detection or imaging modality. We have demonstrated this 3-D scene-data fusion concept in a range of configurations and locations, such as the Fukushima Prefecture in Japan and Chernobyl in Ukraine, on unmanned and manned aerial and ground-based platforms. It provides new means for the detection, mapping, and visualization of radiological and nuclear materials, relevant for the safe and secure operation of nuclear and radiological facilities and for the response to accidental or intentional releases of radioactive materials, where a timely, accurate, and effective assessment is critical. In addition, the ability to visualize nuclear radiation in 3-D and in real time provides new means of communicating with the public and helps to overcome one of the major public concerns: not being able to “see” nuclear radiation.

Highlights

  • Three-dimensional (3-D) gamma-ray and X-ray vision has, in the past, mainly been the stuff of science fiction and comic books

  • We show relevant examples from measurements around the world, performed with radiation detection and imaging instruments in combination with contextual sensors such as light detection and ranging (LiDAR)

  • While we initially developed the concept with the Microsoft (MS) Kinect because of its low cost, its capabilities, and the open software available to access data from its structured-light and visual cameras, all measurements we have performed since 2015 have used either LiDAR or visual camera-based photogrammetry


Introduction

Three-dimensional (3-D) gamma-ray and X-ray vision has, in the past, mainly been the stuff of science fiction and comic books. Computer vision is at the core of numerous technologies to automate, analyze, learn, or control processes or features; nuclear detection and imaging is at the core of the detection and mapping of radioactive materials, relevant for applications ranging from medicine and physics to nuclear security and safety. The combination of these two complementary technologies enables unprecedented and “smarter” ways to detect, map, and visualize nuclear radiation. Scene-data fusion (SDF) represents the realization of this combination by integrating “contextual” sensors, such as visual cameras or light detection and ranging (LiDAR), with nuclear radiation detection and imaging instruments [1,2,3,4,5,6].
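To illustrate the fusion step in its simplest form, the sketch below (in Python; all names, parameters, and the weighting scheme are our own illustrative assumptions, not the authors' implementation) combines a detector trajectory, such as poses estimated from LiDAR data, with per-pose gamma-ray counts into a 3-D voxel map using inverse-square proximity weighting. Deployed SDF systems use more sophisticated image reconstruction; this sketch only conveys the core idea of tying radiation measurements to scene coordinates.

```python
# Minimal sketch of the scene-data fusion idea: fuse a detector trajectory
# (e.g., poses estimated by LiDAR-based localization) with per-pose gamma-ray
# counts into a 3-D voxel map. This toy uses inverse-square proximity
# weighting rather than the reconstruction methods used in actual SDF
# systems; all names and parameters are illustrative assumptions.
import numpy as np

def fuse_counts_to_voxels(poses, counts, grid_origin, voxel_size, grid_shape):
    """Accumulate counts into a voxel grid, weighting each voxel by the
    inverse-square distance to the detector pose that recorded the counts.

    poses       : (N, 3) detector positions in scene coordinates [m]
    counts      : (N,) gamma-ray counts recorded at each pose
    grid_origin : (3,) corner of the voxel grid [m]
    voxel_size  : edge length of a cubic voxel [m]
    grid_shape  : (nx, ny, nz) number of voxels per axis
    """
    # Voxel-center coordinates, shape (nx*ny*nz, 3)
    idx = np.indices(grid_shape).reshape(3, -1).T
    centers = grid_origin + (idx + 0.5) * voxel_size

    weighted = np.zeros(len(centers))
    sensitivity = np.zeros(len(centers))
    for p, c in zip(poses, counts):
        r2 = np.sum((centers - p) ** 2, axis=1)
        w = 1.0 / np.maximum(r2, voxel_size ** 2)  # clamp near-field blow-up
        weighted += c * w
        sensitivity += w

    # Normalize by accumulated sensitivity so voxels sampled from many poses
    # are comparable to those sampled from few
    intensity = np.where(sensitivity > 0, weighted / sensitivity, 0.0)
    return intensity.reshape(grid_shape)

# Example: a short walk along the x axis past a point source at (2, 0, 1) m
rng = np.random.default_rng(0)
poses = np.column_stack([np.linspace(0, 4, 40), np.zeros(40), np.ones(40)])
dist2 = np.sum((poses - np.array([2.0, 0.0, 1.0])) ** 2, axis=1)
counts = rng.poisson(500.0 / np.maximum(dist2, 0.01))
vox = fuse_counts_to_voxels(poses, counts,
                            np.array([0.0, -2.0, 0.0]), 0.25, (16, 16, 8))
print("hottest voxel index:", np.unravel_index(np.argmax(vox), vox.shape))
```

In this toy example, the hottest voxel falls near the simulated source location, illustrating how a freely moving instrument's pose stream and count stream can be merged into a radiation map expressed in the scene's 3-D coordinates.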
