3D asymmetries are major degradation mechanisms in inertial-confinement fusion implosions at the National Ignition Facility (NIF). These asymmetries can be diagnosed and reconstructed with the neutron imaging system (NIS) on three lines of sight around the NIF target chamber. Conventional tomographic methods are used to reconstruct the 3D morphology of the implosion from NIS data [Volegov et al., J. Appl. Phys. 127, 083301 (2020)], but the problem is ill-posed with only three imaging lines of sight. Asymmetries can also be diagnosed with the real-time neutron activation diagnostics (RTNAD) and the neutron time-of-flight (nToF) suite. Since the NIS, RTNAD, and nToF each sample a different part of the implosion using different physical principles, we propose that it is possible to overcome the limitation of too few imaging lines of sight by performing 3D reconstructions that combine information from all three heterogeneous data sources. This work presents a new machine-learning-based reconstruction technique that does just this. By using a simple physics model and a group of neural networks to map 3D morphologies to data, this technique can easily account for data of multiple different types. A simple proof-of-principle demonstration shows that this technique can accurately reconstruct a hot-spot shape using synthetic primary neutron images and a hot-spot velocity vector. In particular, the hot-spot's asymmetry, quantified as spherical harmonic coefficients, is reconstructed to within ±4% of the radius in 90% of test cases. In the future, this technique will be applied to actual NIS, RTNAD, and nToF data to better understand 3D asymmetries at the NIF.
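To make the reconstruction idea concrete, the sketch below is a minimal, hedged illustration (not the authors' code) of the general approach the abstract describes: a neural-network surrogate maps low-order spherical-harmonic hot-spot shape coefficients to a vector of synthetic observables, and the coefficients for a new case are then recovered by gradient descent through the frozen surrogate. The toy forward model, network size, variable names, and observable definitions are all illustrative assumptions, not details from the paper.

```python
# Minimal sketch, assuming a spherical-harmonic shape parameterization and a
# gradient-based inversion through a learned forward surrogate.
import torch
import torch.nn as nn

torch.manual_seed(0)

N_COEF = 8   # e.g. real Y_lm coefficients up to l = 2 (assumed count)
N_OBS = 12   # e.g. image asymmetry modes from 3 lines of sight + velocity (assumed)

# Toy "physics model": a fixed random map with a mild nonlinearity, standing in
# for the mapping from 3D morphology to primary neutron images and hot-spot velocity.
A = torch.randn(N_OBS, N_COEF)
def physics_model(coef):
    return torch.tanh(coef @ A.T)

# Forward surrogate: a small MLP trained to emulate the physics model.
surrogate = nn.Sequential(nn.Linear(N_COEF, 64), nn.ReLU(),
                          nn.Linear(64, 64), nn.ReLU(),
                          nn.Linear(64, N_OBS))
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)
for _ in range(2000):
    coef = 0.1 * torch.randn(256, N_COEF)          # sample shape asymmetries
    loss = nn.functional.mse_loss(surrogate(coef), physics_model(coef))
    opt.zero_grad(); loss.backward(); opt.step()

# Reconstruction: fit coefficients to "measured" data through the frozen surrogate.
true_coef = 0.1 * torch.randn(N_COEF)
data = physics_model(true_coef)                     # synthetic measurement
fit = torch.zeros(N_COEF, requires_grad=True)
fit_opt = torch.optim.Adam([fit], lr=1e-2)
for _ in range(1000):
    loss = nn.functional.mse_loss(surrogate(fit), data)
    fit_opt.zero_grad(); loss.backward(); fit_opt.step()

print("true:", true_coef.numpy().round(3))
print("fit :", fit.detach().numpy().round(3))
```

Because the surrogate accepts any observables it was trained on, heterogeneous data (images, activation signals, time-of-flight quantities) can in principle enter the same loss function, which is the advantage the abstract attributes to this style of reconstruction.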