A Downstream and Vertexing Algorithm for Long-Lived Particle (LLP) Selection at the First High Level Trigger (HLT1) of LHCb
A new algorithm has been developed at LHCb which is able to reconstruct and select very displaced vertices in real time at the first level of the trigger (HLT1). It makes use of the Upstream Tracker (UT) and the Scintillating Fibre tracker (SciFi) of LHCb, and it is executed on GPUs inside the Allen framework. In addition to an optimized strategy, it utilizes a Neural Network (NN) implementation to increase the track reconstruction efficiency and reduce the ghost rate, with very high throughput and a limited time budget. Besides serving to reconstruct K0S and Λ particles from the Standard Model, the Downstream algorithm and the associated two-track vertexing could largely increase the LHCb physics reach for detecting long-lived particles during Run 3.
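The geometric core of the two-track vertexing step can be illustrated with a short, self-contained sketch: downstream tracks behind the magnet are approximated as straight lines, a candidate vertex is taken at their point of closest approach, and the distance of closest approach (DOCA) serves as a selection variable. The Python sketch below shows this construction under those assumptions only; the function names, the straight-line track model, and the max_doca cut value are illustrative rather than the tuned HLT1 selection, and the production algorithm is implemented as CUDA kernels inside Allen, not in Python.

```python
# Illustrative two-track vertexing sketch (NOT the Allen implementation,
# which runs as CUDA kernels on GPU within the HLT1 time budget).
# Downstream tracks behind the magnet are approximated as straight lines
# x(t) = origin + t * direction.
import numpy as np

def closest_approach(o1, d1, o2, d2):
    """Points of closest approach between two lines, and their distance (DOCA)."""
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    if abs(denom) < 1e-12:            # (nearly) parallel tracks: no unique vertex
        return None
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1, p2 = o1 + t1 * d1, o2 + t2 * d2
    return p1, p2, float(np.linalg.norm(p1 - p2))

def make_vertex(o1, d1, o2, d2, max_doca=0.5):
    """Two-track vertex candidate: midpoint of the closest-approach segment,
    accepted only if DOCA < max_doca (placeholder value in mm, not a tuned cut)."""
    result = closest_approach(o1, d1, o2, d2)
    if result is None:
        return None
    p1, p2, doca = result
    return 0.5 * (p1 + p2) if doca < max_doca else None

# Two tracks that cross at (0, 0, 1500) mm, upstream of their first measured hits:
o1 = np.array([ 14.0,  7.0, 2200.0]); d1 = np.array([ 0.020,  0.010, 1.0])
o2 = np.array([-10.5, -3.5, 2200.0]); d2 = np.array([-0.015, -0.005, 1.0])
print(make_vertex(o1, d1, o2, d2))    # ~[0, 0, 1500]: a displaced-vertex candidate
```

In the real trigger the accepted two-track combinations would then feed the downstream selection lines; here the midpoint-of-segment vertex and the single DOCA cut simply stand in for that chain.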