Abstract
Full detector simulation is known to consume a large proportion of the computing resources available to the LHC experiments, and reducing the time spent in simulation will allow for deeper physics studies. There are many avenues to explore, and in this work we investigate those that do not require changes in the GEANT4 simulation suite itself. Several factors affecting the execution time of the full GEANT4 simulation are investigated. A broad range of configurations has been tested to ensure consistency of the physics results. The effect of building GEANT4 as a single dynamic library has been investigated, and the impact of different primary particles at different energies has been evaluated using GDML and GeoModel geometries. Some configurations affect the physics results and are therefore excluded from further analysis. Building GEANT4 as a single dynamic library is shown to increase the execution time and does not represent a viable optimization. Lastly, the static build type is confirmed as the most effective way to reduce the simulation execution time.
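A static-versus-shared comparison like the one summarized above can be set up through GEANT4's standard CMake options. The sketch below is illustrative only: the source path, install prefix, and build-directory names are placeholders, and the option names (`BUILD_STATIC_LIBS`, `GEANT4_USE_GDML`) should be checked against the documentation of the GEANT4 version in use.

```shell
# Hypothetical configuration sketch: build GEANT4 with static archives
# alongside the default shared libraries, for an execution-time comparison.
cmake -S geant4-source -B build-static \
      -DCMAKE_BUILD_TYPE=Release \
      -DBUILD_STATIC_LIBS=ON \
      -DGEANT4_USE_GDML=ON \
      -DCMAKE_INSTALL_PREFIX=/opt/geant4-static
cmake --build build-static -j"$(nproc)"
```

An application can then be linked against the static archives and timed against an otherwise identical shared-library build.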
Highlights
Particle physics has an ambitious experimental program for the coming decades: during the High-Luminosity Large Hadron Collider (HL-LHC) phase, scheduled to begin data taking in 2027, events will be collected at very high rates.
This study is articulated in three main parts: 1. Validation to ensure that physics results are not affected by compiler-specific options.
This study has shown that unsafe math optimizations, as well as certain compilers, namely the Clang family and older GCC versions, may have a negative impact on the quality of the physics results.
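The unsafe-math effect flagged above comes from flags such as `-ffast-math` (which implies `-funsafe-math-optimizations` on GCC and Clang), allowing the compiler to reassociate and simplify floating-point expressions in ways that can change results in the last bits and so alter physics output. A generic way to scan such flags is to configure otherwise-identical builds that differ only in `CMAKE_CXX_FLAGS`; the sketch below is an assumption about the setup, not the exact procedure used in the study, and the source path is a placeholder.

```shell
# Hypothetical flag scan: configure two otherwise-identical Release builds,
# one with and one without unsafe math optimizations, then compare the
# physics output of the resulting binaries.
for FLAGS in "" "-ffast-math"; do
  cmake -S geant4-source -B "build${FLAGS:+-fastmath}" \
        -DCMAKE_BUILD_TYPE=Release \
        -DCMAKE_CXX_FLAGS="$FLAGS"
  cmake --build "build${FLAGS:+-fastmath}" -j"$(nproc)"
done
```

Comparing per-event observables between the two builds is what reveals whether an optimization is safe to enable.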
Summary
Particle physics has an ambitious experimental program for the coming decades: during the High-Luminosity Large Hadron Collider (HL-LHC) phase, scheduled to begin data taking in 2027, events will be collected at very high rates. The rate foreseen for the ATLAS experiment is 10 kHz, approximately ten times higher than during previous runs [1, 2]. In addition to the experimental challenges of collecting, storing and analysing such a large volume of data, a comparable amount of Monte Carlo (MC) simulated data will be required in order to prevent simulation-dominated systematic uncertainties [3]. Approximately half of the MC events in ATLAS are produced with full simulation, i.e. using the GEANT4 simulation toolkit [4]. Reducing the time spent on simulation is therefore a priority, and an active R&D program aimed at optimizing the GEANT4 CPU requirements is ongoing in ATLAS.