Abstract

Machine learning has emerged as a powerful solution to the modern challenges in accelerator physics. However, the limited availability of beam time, the computational cost of simulations, and the high dimensionality of optimization problems pose significant challenges in generating the required data for training state-of-the-art machine learning models. In this work, we introduce Cheetah, a PyTorch-based high-speed differentiable linear beam dynamics code. Cheetah enables the fast collection of large datasets by reducing computation times by multiple orders of magnitude and facilitates efficient gradient-based optimization for accelerator tuning and system identification. This positions Cheetah as a user-friendly, readily extensible tool that integrates seamlessly with widely adopted machine learning tools. We showcase the utility of Cheetah through five examples, including reinforcement learning training, gradient-based beamline tuning, gradient-based system identification, physics-informed Bayesian optimization priors, and modular neural network surrogate modeling of space charge effects. The use of such a high-speed differentiable simulation code will simplify the development of machine learning-based methods for particle accelerators and fast-track their integration into everyday operations of accelerator facilities.

Published by the American Physical Society 2024
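To illustrate the idea behind gradient-based tuning through a differentiable beam dynamics model, the following is a minimal sketch written in plain PyTorch. It does not use Cheetah's own element classes; the helper functions `drift` and `thin_quad` and all numerical values are illustrative assumptions, standing in for a drift-quadrupole-drift beamline whose quadrupole strength is optimized by automatic differentiation through the tracking.

```python
import torch

# Illustrative stand-in for a differentiable linear beamline (not Cheetah's API).
# Transport acts on the horizontal phase-space vector (x, x'); lengths in metres.
def drift(length):
    return torch.tensor([[1.0, length], [0.0, 1.0]])

def thin_quad(k):
    # Thin-lens quadrupole with integrated strength k (focusing in x for k > 0).
    one = torch.tensor(1.0)
    zero = torch.tensor(0.0)
    return torch.stack([torch.stack([one, zero]), torch.stack([-k, one])])

# Incoming particles: small random cloud in (x, x').
torch.manual_seed(0)
beam = 1e-3 * torch.randn(2, 1000)

# Tunable quadrupole strength; autograd propagates gradients through tracking.
k = torch.tensor(5.0, requires_grad=True)
optimizer = torch.optim.Adam([k], lr=0.1)

for step in range(200):
    optimizer.zero_grad()
    transfer = drift(1.0) @ thin_quad(k) @ drift(1.0)
    out = transfer @ beam
    # Objective: minimise the horizontal beam size at the downstream screen.
    loss = out[0].std()
    loss.backward()
    optimizer.step()

print(f"optimised quadrupole strength k = {k.item():.3f} 1/m")
```

Because the tracking is expressed entirely in differentiable tensor operations, the same pattern extends to many tuning knobs and elements, which is the mechanism the paper exploits for accelerator tuning and system identification.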
