Abstract

The use of Ring Imaging Cherenkov (RICH) detectors offers a powerful technique for identifying particle species in particle physics. These detectors produce 2D images formed by rings of individual photons superimposed on a background of photon rings from other particles. The RICH particle identification (PID) is essential to the LHCb experiment at CERN. While the current PID algorithm performed well during the LHC data-taking periods from 2010 to 2018, its complexity poses a challenge for LHCb computing infrastructure upgrades towards multi-core architectures. The high particle multiplicity environment of future LHC runs strongly motivates a shift towards high-throughput computing for the online event reconstruction. In this contribution, we introduce a convolutional neural network (CNN) approach to particle identification in the LHCb RICH. The CNN takes binary input images from the two RICH detectors to classify particle species. The input images are polar-transformed sub-sections of the RICH photon-detection planes. The model is hyperparameter-optimised and trained on classification accuracy with simulated collision data for the upcoming LHC operation starting in 2022. The PID performance of the CNN is comparable to that of the conventional algorithm, and its simplicity renders it suitable for fast online reconstruction through parallel processing. We show that under conditions of reduced combinatorial background, as expected from the introduction of timing resolution to the RICH detectors in future upgrades, the network achieves a particle identification performance close to 100 %, with simultaneous misclassification of the most prevalent particle species approaching 0 %.
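The polar transformation mentioned above can be illustrated with a minimal sketch: photon hits on the detection plane are re-binned in (radius, angle) coordinates around a track's projected ring centre, so that a Cherenkov ring becomes a band at constant radius. The function name, binning, and geometry below are illustrative assumptions, not the detector's actual parameters.

```python
import numpy as np

def polar_transform(hits_xy, center, r_bins=32, phi_bins=32, r_max=100.0):
    """Bin photon hits (x, y) into a binary (r, phi) image around `center`.

    A Cherenkov ring centred on the track projection maps to a narrow
    band at constant radius, a pattern a CNN can classify easily.
    All names and binnings are hypothetical, for illustration only.
    """
    dx = hits_xy[:, 0] - center[0]
    dy = hits_xy[:, 1] - center[1]
    r = np.hypot(dx, dy)                 # radial distance of each hit
    phi = np.arctan2(dy, dx)             # azimuthal angle in [-pi, pi]
    img = np.zeros((r_bins, phi_bins), dtype=np.uint8)
    r_idx = np.clip((r / r_max * r_bins).astype(int), 0, r_bins - 1)
    phi_idx = np.clip(((phi + np.pi) / (2 * np.pi) * phi_bins).astype(int),
                      0, phi_bins - 1)
    img[r_idx, phi_idx] = 1              # binary image: hit / no hit
    return img

# Example: an ideal ring of radius 40 (arbitrary units) around the centre
angles = np.linspace(0, 2 * np.pi, 50, endpoint=False)
ring = np.stack([40.0 * np.cos(angles), 40.0 * np.sin(angles)], axis=1)
img = polar_transform(ring, center=(0.0, 0.0))
# All hits of the ring fall into a single radial row of the polar image.
```

In this representation the ring radius, which encodes the Cherenkov angle and hence the particle species, appears as the vertical position of the band, simplifying the pattern the network must learn.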
