Abstract
Isotopic ratios of radioxenon captured in the atmosphere can be indicators of an underground nuclear explosion. However, according to a standard model of the evolution of radioxenon isotopic abundances in a nuclear explosion cavity, civilian sources of xenon isotopes, such as medical isotope production facilities and nuclear reactors, can interfere with the detection of signals associated with nuclear testing. We find that this standard model is idealized: it omits the physical processes that partition the radionuclide inventory between a gas phase and the rock melt created by the detonation, and it ignores seepage, the continuous leakage of gases from the cavity or zone of collapse. Applying more realistic assumptions about the state of the detonation cavity yields isotopic activity ratios that differ from the civilian background more than the idealized standard model suggests, while also reducing the quantity of radioxenon available for atmospheric release and subsequent detection. Our simulations indicate that the physical evolution of the detonation cavity during the post-detonation partitioning process strongly influences isotopic evolution in the gas phase. Collapse of the cavity potentially has the greatest effect on the partitioning of the refractory fission products that are precursors to radioxenon. The model allows for the possibility that post-detonation seismicity can be used to predict isotopic evolution.
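To illustrate why radioxenon activity ratios carry timing information, the sketch below computes the decay of a hypothetical 135Xe/133Xe activity ratio under pure exponential decay. This is a simplification for illustration only: it uses the well-established half-lives of the two isotopes but neglects ingrowth from iodine precursors and all of the cavity partitioning and seepage effects discussed in the abstract; the initial activities are arbitrary placeholders, not values from the paper.

```python
import math

# Half-lives in days (well-established nuclear data values)
T_HALF = {"Xe-133": 5.243, "Xe-135": 9.14 / 24.0}

def decay_const(t_half_days):
    """Decay constant: lambda = ln(2) / t_1/2."""
    return math.log(2) / t_half_days

def activity_ratio(a135_0, a133_0, t_days):
    """135Xe/133Xe activity ratio after t_days of pure decay.

    Ingrowth from iodine precursors and cavity processes are
    neglected; this only shows the bare decay behavior.
    """
    lam135 = decay_const(T_HALF["Xe-135"])
    lam133 = decay_const(T_HALF["Xe-133"])
    return (a135_0 / a133_0) * math.exp(-(lam135 - lam133) * t_days)

# The ratio falls quickly because 135Xe (t1/2 ~ 9.14 h) decays much
# faster than 133Xe (t1/2 ~ 5.243 d), so the ratio dates the source.
for t in (0.0, 1.0, 3.0, 7.0):
    print(f"t = {t:4.1f} d  ratio = {activity_ratio(1.0, 1.0, t):.4f}")
```

Because the two isotopes decay at very different rates, the ratio at the time of sampling constrains how long ago the xenon was produced, which is the basic physical lever behind discriminating explosion signals from a steady civilian background.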