Abstract

Advances in machine learning make it possible to leverage data from expensive simulations of high-energy-density experiments to significantly cut down on the computational time and costs associated with the search for optimal target designs. This study presents an application of cutting-edge Bayesian optimization methods to the one-dimensional (1D) design optimization of double shell graded layer targets for inertial confinement fusion experiments. The investigation aims to reduce hydrodynamic instabilities while retaining high yields for future National Ignition Facility (NIF) experiments. Machine learning methods can use predictive physics simulations to identify, from within the vast design space, graded layer designs that demonstrate high predicted performance, including novel designs with high uncertainty in performance that may hold unexpected promise. By applying machine learning tools to the simulation design, we map the trade-off between 1D yield and instability, specifically isolating parameter ranges that maintain high performance while showing significantly improved Rayleigh–Taylor stability over the point design. The groundwork laid in this study will serve as a useful design tool for future NIF experiments with graded layer targets.

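For readers unfamiliar with the optimization approach named above, the sketch below shows the general shape of a Bayesian optimization loop: a Gaussian process surrogate is fit to a small set of expensive evaluations, and an acquisition function proposes the next design point to simulate. Everything in this example is an illustrative assumption, not the paper's method: the single normalized design variable, the toy objective standing in for an implosion simulation, and the use of scikit-learn's GaussianProcessRegressor with an expected-improvement acquisition are placeholders for the actual simulation workflow and optimizer, which the abstract does not specify.

```python
# Minimal sketch of a Bayesian optimization loop of the kind described above.
# All names (toy objective, parameter bounds, kernel settings) are illustrative
# assumptions, not the paper's actual simulation workflow or code.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)


def toy_simulation(x):
    """Cheap stand-in for an expensive 1D implosion simulation.

    Returns a scalar score that rewards a yield proxy but penalizes a
    proxy for Rayleigh-Taylor growth; purely illustrative.
    """
    yield_proxy = np.exp(-((x - 0.6) ** 2) / 0.05)
    instability_proxy = 0.5 * np.sin(8 * np.pi * x) ** 2
    return yield_proxy - instability_proxy


def expected_improvement(mu, sigma, best, xi=0.01):
    """Standard expected-improvement acquisition for a maximization problem."""
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best - xi) / sigma
    return (mu - best - xi) * norm.cdf(z) + sigma * norm.pdf(z)


# Initial design: a handful of "simulations" over a normalized design variable.
X = rng.uniform(0.0, 1.0, size=(5, 1))
y = np.array([toy_simulation(x[0]) for x in X])

gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(20):
    gp.fit(X, y)
    # A dense candidate grid stands in for a proper acquisition optimizer.
    candidates = np.linspace(0.0, 1.0, 1000).reshape(-1, 1)
    mu, sigma = gp.predict(candidates, return_std=True)
    x_next = candidates[np.argmax(expected_improvement(mu, sigma, y.max()))]
    # Run the (stand-in for an) expensive simulation and augment the data set.
    X = np.vstack([X, x_next])
    y = np.append(y, toy_simulation(x_next[0]))

print(f"Best design found: x = {X[np.argmax(y)][0]:.3f}, score = {y.max():.3f}")
```

In a multi-objective setting such as the yield-versus-instability trade-off described in the abstract, the same loop would typically return a Pareto front of candidate designs rather than a single optimum; the single-objective form above is kept only for brevity.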