Abstract
Machine Learning techniques have been used in a number of applications by the HEP community: in this talk, we discuss the case of detector simulation. The need for simulated events, expected in the future for the LHC experiments and their High Luminosity upgrades, is increasing dramatically and requires new fast simulation solutions. We describe an R&D activity aimed at providing a configurable tool capable of training a neural network to reproduce the detector response and speed up standard Monte Carlo simulation. This is a generic approach in the sense that such a network could be designed and trained to simulate any kind of detector and, eventually, the whole data processing chain, so that the final reconstructed quantities are obtained directly, in one step, in a small fraction of the time. We present the first application of three-dimensional convolutional Generative Adversarial Networks to the simulation of high-granularity electromagnetic calorimeters. We describe detailed validation studies comparing our results to Geant4 Monte Carlo simulation. Finally, we show how this tool could be generalized to describe a whole class of calorimeters, opening the way to a generic machine-learning-based fast simulation approach.
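As an illustration of the kind of network such a tool could train, the following is a minimal sketch, in Python with Keras/TensorFlow (an assumption about the software stack, not a statement about the published implementation), of a three-dimensional convolutional generator: a latent vector, conditioned on the primary-particle energy, is upsampled with transposed 3D convolutions into a grid of calorimeter cell energies. The 25x25x25 grid, the latent dimension and all layer sizes are purely illustrative.

import tensorflow as tf
from tensorflow.keras import layers

def build_generator(latent_dim=200):
    # Latent noise plus the primary-particle energy used as a conditioning label.
    noise = layers.Input(shape=(latent_dim,), name="noise")
    e_p = layers.Input(shape=(1,), name="primary_energy")
    x = layers.Concatenate()([noise, e_p])
    x = layers.Dense(8 * 8 * 8 * 64)(x)
    x = layers.Reshape((8, 8, 8, 64))(x)
    # Upsample with transposed 3D convolutions towards the target granularity.
    x = layers.Conv3DTranspose(32, kernel_size=4, strides=2, padding="same")(x)  # 16^3
    x = layers.BatchNormalization()(x)
    x = layers.LeakyReLU(0.2)(x)
    x = layers.Conv3DTranspose(16, kernel_size=4, strides=2, padding="same")(x)  # 32^3
    x = layers.BatchNormalization()(x)
    x = layers.LeakyReLU(0.2)(x)
    # Crop to the assumed 25x25x25 grid; ReLU keeps cell energies non-negative.
    x = layers.Cropping3D(cropping=((3, 4), (3, 4), (3, 4)))(x)
    image = layers.Conv3D(1, kernel_size=3, padding="same", activation="relu",
                          name="ecal_image")(x)
    return tf.keras.Model([noise, e_p], image, name="generator")

Once trained, generating a batch of showers amounts to a single forward pass, e.g. build_generator().predict([noise_batch, energy_batch]), which is where the speed-up over step-by-step Monte Carlo simulation comes from.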
Highlights
The future High Luminosity LHC phase will be very demanding in terms of computing resources, because of the amount and complexity of the data that will be collected, stored and analysed [1]; correspondingly, the need for simulated data is expected to grow by one order of magnitude.
Our current study focuses on the electromagnetic calorimeter (ECAL).
Following the strategy of auxiliary classifier Generative Adversarial Networks (GANs) [12], whose results indicate that introducing labels provides faster convergence and improved stability, we assign two additional regression tasks to the discriminator: the estimation of the incoming particle energy (Ep) and of the total energy measured by the calorimeter (Ecal), corresponding to the sum of all the energy depositions in the cells of the image.
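A minimal sketch of this multi-task discriminator, again in Keras/TensorFlow and with a convolutional trunk, layer sizes and loss weights that are illustrative assumptions rather than the published configuration, is given below: besides the usual real/fake output, two regression heads estimate Ep and Ecal, and the auxiliary losses are weighted lower than the adversarial one so that they guide rather than dominate the training.

import tensorflow as tf
from tensorflow.keras import layers

def build_discriminator(grid=(25, 25, 25)):
    image = layers.Input(shape=grid + (1,), name="ecal_image")
    x = layers.Conv3D(16, kernel_size=5, padding="same")(image)
    x = layers.LeakyReLU(0.2)(x)
    x = layers.Dropout(0.2)(x)
    x = layers.Conv3D(8, kernel_size=5, strides=2, padding="same")(x)
    x = layers.LeakyReLU(0.2)(x)
    x = layers.Flatten()(x)

    # Standard adversarial output: probability that the shower comes from Geant4.
    real_fake = layers.Dense(1, activation="sigmoid", name="real_fake")(x)
    # Auxiliary regression tasks: primary-particle energy Ep and total measured
    # energy Ecal (the latter corresponds to the sum of all cell depositions).
    e_p = layers.Dense(1, name="Ep")(x)
    e_cal = layers.Dense(1, name="Ecal")(x)
    return tf.keras.Model(image, [real_fake, e_p, e_cal], name="discriminator")

disc = build_discriminator()
disc.compile(
    optimizer="adam",
    loss={"real_fake": "binary_crossentropy",
          "Ep": "mean_absolute_percentage_error",
          "Ecal": "mean_absolute_percentage_error"},
    loss_weights={"real_fake": 1.0, "Ep": 0.1, "Ecal": 0.1},
)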
Summary
The future High Luminosity LHC phase will be very demanding in terms of computing resources, because of the amount and complexity of the data that will be collected, stored and analysed [1]; correspondingly, the need for simulated data is expected to grow by one order of magnitude. For this reason, High Energy Physics (HEP) software and, in particular, Monte Carlo-based simulation are going through an important phase of restructuring and optimisation for new computing architectures.