Performance Portability of the Particle Tracking Algorithm Using SYCL

Abstract
With fast advancements in the detector and software technologies used in large-scale physics experiments, the performance requirements for computing systems used in both online and offline data processing have grown drastically. The industry offers a wide range of hardware devices to choose from when designing such systems, but this choice in turn poses a technical challenge: efficient device utilization requires deep knowledge of the particular technology. While a high level of optimization can be achieved this way, it yields a hard-to-maintain codebase with limited upgradeability and limited capability to migrate to other platforms. This becomes a significant challenge, especially when available human resources are limited. In this paper, we present the application of the SYCL heterogeneous programming model, which can help overcome these drawbacks. By introducing an abstraction layer, SYCL decouples the source code from the computing-device architecture and lets the developer select the compilation target; the same codebase can therefore be executed on any supported hardware platform. We use the particle track reconstruction algorithm developed for the Forward Tracker of the P̄ANDA experiment to demonstrate portability across various computing architectures and to evaluate the performance of the solution.
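The single-source idea the abstract describes — one kernel body, with the execution target selected separately — can be sketched conceptually. The example below is plain Python, not SYCL C++, and all names (`scale_kernel`, `run`, the backend tags) are hypothetical; it only mimics the pattern of writing device-agnostic kernel code once and choosing where it runs.

```python
# Conceptual sketch of the single-source portability pattern (not SYCL):
# the kernel body is written once; a backend tag selects how it executes.
from concurrent.futures import ThreadPoolExecutor

def scale_kernel(x, factor):
    # device-agnostic kernel body, applied to one element
    return x * factor

def run(kernel, data, backend="cpu", **kw):
    if backend == "cpu":
        # serial stand-in for a host/CPU target
        return [kernel(x, **kw) for x in data]
    if backend == "parallel":
        # thread-pool stand-in for an accelerator target;
        # in SYCL this would be a queue bound to a GPU or FPGA device
        with ThreadPoolExecutor() as pool:
            return list(pool.map(lambda x: kernel(x, **kw), data))
    raise ValueError(f"unsupported backend: {backend}")

hits = [1.0, 2.0, 3.0]
# the same kernel source runs on either "backend"
assert run(scale_kernel, hits, backend="cpu", factor=2.0) == [2.0, 4.0, 6.0]
assert run(scale_kernel, hits, backend="parallel", factor=2.0) == [2.0, 4.0, 6.0]
```

The point of the pattern, as in SYCL, is that only the dispatch layer knows about the target; the kernel body itself never changes.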

Similar Papers
  • Research Article
  • Cited by 1
  • 10.1103/physreve.105.044608
Correlation Tracking: Using simulations to interpolate highly correlated particle tracks.
  • Apr 25, 2022
  • Physical review. E
  • Ella M King + 4 more

Despite significant advances in particle imaging technologies over the past two decades, few advances have been made in particle tracking, i.e., linking individual particle positions across time-series data. State-of-the-art tracking algorithms are highly effective for systems in which the particles behave mostly independently. However, they become inaccurate when particle motion is highly correlated, such as in dense or strongly interacting systems. Accurate particle tracking is essential in the study of the physics of dense colloids, such as the study of dislocation formation, nucleation, and shear transformations. Here, we present a method for particle tracking that incorporates information about the correlated motion of the particles. We demonstrate significant improvement over the state-of-the-art tracking algorithm on simulated data of highly correlated systems.
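The linking step this abstract refers to — connecting particle positions frame to frame — is, in its baseline form, a nearest-neighbour assignment. The sketch below shows that baseline (not the paper's correlated-motion method); the function name and the `max_disp` cutoff are illustrative assumptions.

```python
# Minimal greedy nearest-neighbour frame-to-frame linking, the kind of
# baseline tracker the paper improves on. Positions are 2-D tuples.
import math

def link_frames(frame_a, frame_b, max_disp=5.0):
    """Link each particle in frame_a to its nearest unclaimed neighbour
    in frame_b within max_disp; returns (index_a, index_b) pairs."""
    links, taken = [], set()
    for i, p in enumerate(frame_a):
        best, best_d = None, max_disp
        for j, q in enumerate(frame_b):
            if j in taken:
                continue
            d = math.dist(p, q)
            if d <= best_d:
                best, best_d = j, d
        if best is not None:
            links.append((i, best))
            taken.add(best)   # each target position is claimed once
    return links

# two particles, each moving by half a pixel: both links recovered
assert link_frames([(0, 0), (10, 10)], [(0.5, 0.0), (10.0, 10.5)]) == [(0, 0), (1, 1)]
```

When displacements between frames become comparable to inter-particle spacing, this greedy assignment starts mislinking, which is exactly the regime the correlated-motion approach targets.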

  • Research Article
  • Cited by 11
  • 10.1016/j.expthermflusci.2020.110346
3D Lagrangian particle tracking of a subsonic jet using multi-pulse Shake-The-Box
  • Jan 19, 2021
  • Experimental Thermal and Fluid Science
  • Peter Manovski + 7 more

  • Conference Article
  • 10.23919/mixdes.2018.8443602
Device Support for Giga-sampling Digitizers
  • Jun 1, 2018
  • Michal Basiuras + 4 more

In many Large-Scale Physics Experiments (LSPE), such as ITER, efficient data acquisition and processing is required. The system uses fast Analog-to-Digital Converters (ADCs) to acquire data at rates of gigabits or tens of gigabits per second. The software provided by a manufacturer demonstrates the basic capabilities of the hardware but cannot be used for the demanding applications of LSPE. Therefore, the need arises to create custom software for data acquisition, communication, and processing. In the development of such software, dedicated Control System Frameworks (CSF) are commonly used. The ultimate goal is to create a universal framework for high-speed ADCs that is easily scalable to a wide range of devices and offers the high performance needed in LSPE. Such a framework needs to support a scalable architecture that allows processing data from many channels distributed across a complex machine, and to offer a human-friendly operator panel for device configuration. The paper presents data acquisition software developed for a 5 GS/s pulse digitizer based on the DRS4 ASIC. The software is based on the Experimental Physics and Industrial Control System (EPICS). Operator panels were created with the use of the Best OPI Yet (BOY). The purpose of this paper is to describe the development of such software, highlighting high-performance problems.

  • Research Article
  • Cited by 1
  • 10.1088/1748-0221/18/12/p12012
A generic node identification and routing algorithm in a distributed data acquisition platform: D-Matrix
  • Dec 1, 2023
  • Journal of Instrumentation
  • Zhengyang Sun + 5 more

Data acquisition (DAQ) systems are vital components in large-scale physics experiments. For communication, control and tracking purposes, all system components must be unambiguously identified. D-Matrix, as a generic distributed stream processing DAQ platform, accommodates various device connection methods, including networking, PCIe bus, point-to-point optical fiber links, etc. Although there are mature node identification solutions for each connection method individually, the uniform solution applicable to all connection methods remains crucial in a generic DAQ platform. D-Matrix abstracts a unified Multiple Point-to-Point transmission model (MPP model), supporting various physical connection methods and enabling multiple communication channels on a single physical link. Referring to the tree universal address system, a generic automatic node identification algorithm is proposed based on the MPP model. With simple configuration, this algorithm enables automatic node traversal, yielding a routing-based identification result that supports clustering and hierarchical node management requirements common in large-scale physics experiments. This paper explains the details of the algorithm and presents an example of the DAQ system based on the algorithm.

  • Research Article
  • 10.1088/1748-0221/20/05/p05041
AI-based particle track identification in scintillating fibres read out with imaging sensors
  • May 1, 2025
  • Journal of Instrumentation
  • Noemi Bührer + 4 more

This paper presents the development and application of an AI-based method for particle track identification using scintillating fibres read out with imaging sensors. We propose a variational autoencoder (VAE) to efficiently filter and identify frames containing signal from the substantial data generated by SPAD array sensors. Our VAE model, trained on purely background frames, demonstrated a high capability to distinguish frames containing particle tracks from background noise. The performance of the VAE-based anomaly detection was validated with experimental data, demonstrating the method's ability to efficiently identify relevant events with rapid processing time, suggesting a solid prospect for deployment as a fast inference tool on hardware for real-time anomaly detection. This work highlights the potential of combining advanced sensor technology with machine learning techniques to enhance particle detection and tracking.

  • Research Article
  • Cited by 8
  • 10.1051/epjconf/20135005001
PEPT: An invaluable tool for 3-D particle tracking and CFD simulation verification in hydrocyclone studies
  • Jan 1, 2013
  • EPJ Web of Conferences
  • Yu-Fen Chang + 3 more

Particle tracks in a hydrocyclone generated both experimentally by positron emission particle tracking (PEPT) and numerically with Eulerian–Lagrangian CFD have been studied and compared. A hydrocyclone with a cylinder-on-cone design was used in this study, the geometries used in the CFD simulations and in the experiments being identical. It is shown that it is possible to track a fast-moving particle in a hydrocyclone using PEPT with high temporal and spatial resolution. The numerical 3-D particle trajectories were generated using the Large Eddy Simulation (LES) turbulence model for the fluid and Lagrangian particle tracking for the particles. The behavior of the particles was analyzed in detail and found to be consistent between experiments and CFD simulations. The tracks of the particles are discussed and related to the fluid flow field visualized in the CFD simulations using the cross-sectional static pressure distribution.

  • Research Article
  • Cited by 22
  • 10.1007/s00348-021-03172-0
Reconstructing velocity and pressure from noisy sparse particle tracks using constrained cost minimization
  • Mar 23, 2021
  • Experiments in Fluids
  • Karuna Agarwal + 4 more

Emerging time-resolved volumetric PIV techniques have made simultaneous measurements of velocity and pressure fields possible. Yet, in many experimental setups, satisfying the spatial and temporal resolution requirements is a challenge. To improve the quality of sparse and noisy data, this paper introduces a constrained cost minimization (CCM) technique, which interpolates unstructured particle tracks to obtain the velocity, velocity gradients, and material acceleration, hence the pressure, on a Eulerian grid. This technique incorporates physical constraints, such as a divergence-free velocity field and curl-free pressure gradients. The performance is evaluated using synthetic particle tracks for an unsteady double gyre and direct numerical simulation data for a turbulent channel flow, with varying particle concentrations and added errors. The errors in pressure, calculated using omni-directional integration, and correlations with the original data are compared to those obtained using the singular value decomposition (SVD) interpolation technique. The CCM errors are mostly lower, and the correlation is higher and less sensitive to particle sparsity and added errors than those of SVD. The synthetic particle traces are also projected onto four planar images to evaluate the performance of the new procedure together with shake-the-box (STB) particle tracking. A comparison of pressure spectra and correlations with the original data shows very good agreement for the CCM method. Hence, CCM appears to be an effective method for improving the interpolation of sparse data. Sample experimental data obtained in the shear layer behind a backward-facing step demonstrate the application of STB and CCM to resolve the pressure field in coherent vortex structures.

  • Research Article
  • Cited by 1
  • 10.1111/cgf.14304
Leveraging Topological Events in Tracking Graphs for Understanding Particle Diffusion
  • Jun 1, 2021
  • Computer Graphics Forum
  • T Mcdonald + 8 more

Single particle tracking (SPT) of fluorescent molecules provides significant insights into the diffusion and relative motion of tagged proteins and other structures of interest in biology. However, despite the latest advances in high‐resolution microscopy, individual particles are typically not distinguished from clusters of particles. This lack of resolution obscures potential evidence for how merging and splitting of particles affect their diffusion and any implications on the biological environment. The particle tracks are typically decomposed into individual segments at observed merge and split events, and analysis is performed without knowing the true count of particles in the resulting segments. Here, we address the challenges in analyzing particle tracks in the context of cancer biology. In particular, we study the tracks of KRAS protein, which is implicated in nearly 20% of all human cancers, and whose clustering and aggregation have been linked to the signaling pathway leading to uncontrolled cell growth. We present a new analysis approach for particle tracks by representing them as tracking graphs and using topological events – merging and splitting, to disambiguate the tracks. Using this analysis, we infer a lower bound on the count of particles as they cluster and create conditional distributions of diffusion speeds before and after merge and split events. Using thousands of time‐steps of simulated and in‐vitro SPT data, we demonstrate the efficacy of our method, as it offers the biologists a new, detailed look into the relationship between KRAS clustering and diffusion speeds.

  • Research Article
  • Cited by 19
  • 10.1016/j.jhydrol.2005.09.003
Determination of stochastic well head protection zones by direct propagation of uncertainties of particle tracks
  • Oct 25, 2005
  • Journal of Hydrology
  • Harald Kunstmann + 1 more

  • Research Article
  • Cited by 27
  • 10.1007/s00348-017-2390-2
Large-scale volumetric flow measurement in a pure thermal plume by dense tracking of helium-filled soap bubbles
  • Aug 3, 2017
  • Experiments in Fluids
  • Florian Huhn + 5 more

We present a spatially and temporally highly resolved flow measurement covering a large volume (~0.6 m³) in a pure thermal plume in air. The thermal plume develops above an extended heat source and is characterized by moderate velocities (U ~ 0.35 m/s) with a Reynolds number of Re ~ 500 and a Rayleigh number of Ra ~ 10⁶. We demonstrate the requirements and capabilities of the measurement equipment and the particle tracking approach to probe measurement volumes up to and beyond one cubic meter. The use of large tracer particles (300 μm), helium-filled soap bubbles (HFSBs), is crucial and yields high particle image quality over large volume depths when illuminated with arrays of pulsed high-power LEDs. The experimental limitations of the HFSBs, their limited lifetime and their intensity loss over time, are quantified. The HFSBs' uniform particle images allow an accurate reconstruction of the flow using Shake-The-Box particle tracking with high particle concentrations up to 0.1 particles per pixel. This enables tracking of up to 275,000 HFSBs simultaneously. After interpolating the scattered data onto a regular grid with a Navier–Stokes regularization, the velocity field of the thermal plume reveals a multitude of vortices with a smooth temporal evolution and a remarkable coherence in time (see animation, supplementary data). Acceleration fields are also derived from interpolated particle tracks and complement the flow measurement. Additionally, the flow map, the basis of a large class of Lagrangian coherent structures, is computed directly from observed particle tracks. We show entrainment regions and coherent vortices of the thermal plume in the flow map and compute fields of the finite-time Lyapunov exponent.

  • Dissertation
  • 10.11588/heidok.00022581
Diagnosing Software Configuration Errors via Static Analysis
  • Jan 1, 2017
  • Zhao Dong

Software misconfiguration is responsible for a substantial part of today's system failures, causing about one quarter of all user-reported issues. Identifying their root causes can be costly in terms of time and human resources. To reduce the effort, researchers from industry and academia have developed many techniques to assist software engineers in troubleshooting software configuration. Unfortunately, applying these techniques to diagnose software misconfigurations is challenging, since the data or operations they require are difficult to obtain in practice. For instance, some techniques rely on a database of configuration data, which is often not publicly available for reasons of data privacy. Some techniques rely heavily on runtime information from a failure run, which requires reproducing a configuration error and rerunning the misconfigured system. Reproducing a configuration error is costly, since misconfiguration depends strongly on the operating environment. Other techniques need testing oracles, which is challenging for ordinary end users. This thesis explores techniques for diagnosing configuration errors that can be deployed in practice. We develop techniques for troubleshooting software configuration that rely on static analysis of a software system and do not need to execute the application. The source code and configuration documents of a system required by the techniques are often available, especially for open-source software programs. Our techniques can be deployed as third-party services. The first technique addresses configuration errors due to erroneous option values. It analyzes software programs and infers whether a possible execution path exists from where an option value is loaded to the code location where the failure becomes visible. Options whose values might flow into such a crashing site are considered possible root causes of the error.
Finally, we compute the correlation degrees of these options with the error using stack-trace information and rank them. The top-ranked options are more likely to be the root cause of the error. Our evaluation shows that the technique is highly effective in diagnosing the root causes of configuration errors. The second technique automatically extracts the names of options read by a program and their read points in the source code. We first identify statements loading option values, then infer which options are read by each statement, and finally output a map of these options and their read points. With the map, we are able to detect options in the documents that are not read by the corresponding version of the program. This allows locating configuration errors due to inconsistencies between configuration documents and source code. Our evaluation shows that the technique can precisely identify option read points and infer option names, and it discovered multiple previously unknown inconsistencies between documented options and source code.

  • Research Article
  • 10.5194/esurf-13-549-2025
Computational sedimentation modelling calibration: a tool to measure the settling velocity under different gravity conditions
  • Jul 14, 2025
  • Earth Surface Dynamics
  • Nikolaus J Kuhn + 1 more

Research in zero or reduced gravity is essential to prepare and support planetary sciences and space exploration. In this study, an instrument specifically designed to measure the settling velocity of sediment particles under normal-gravity, hypergravity, and reduced-gravity conditions is presented. The lower gravity on Mars potentially reduces drag on particles settling in water, which in turn may affect the texture of sedimentary rocks forming in a standing or moving body of water with settling particles. An environment in which to test such potential effects is the parabolic flight, which offers reduced gravity for up to 30 s. Exact tracing of particle tracks while settling is essential to assess the impact of gravity on flow hydraulics, drag, and settling velocity. In this study, we present an advanced version of previous instruments, including the approach to particle tracking and track analysis. The trajectories of particles settling in water were recorded under reduced Martian and lunar gravity, during the hypergravity phases of the plane's pull-up, and at terrestrial gravity on Earth. The data were used to compute the terminal settling velocity of isolated and small groups of particles and compared with results calculated using the semi-theoretical formula of Ferguson and Church (2004). The analysis showed that with an improved design of settling chambers, particle recording, and tracking, a highly precise measurement of settling velocity is possible. This illustrates that the parabolic flight environment is suited not just for broad qualitative comparisons between different gravity environments but also for highly precise data acquisition on flow hydraulics associated with particle settling.

  • Research Article
  • Cited by 21
  • 10.1016/j.nima.2018.06.007
Detailed investigation on the possibility of using EJ-299-33A plastic scintillator for fast neutron spectroscopy in large scale experiments
  • Jun 6, 2018
  • Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment
  • Pratap Roy + 7 more

  • Research Article
  • 10.18409/ispiv.v1i1.98
Kalman filtering approach to particle track filtering and track uncertainty quantification for 3D PTV measurement
  • Aug 1, 2021
  • 14th International Symposium on Particle Image Velocimetry
  • Rudra Sethu Viji + 5 more

Three-dimensional Particle Tracking Velocimetry (3D-PTV) is a non-invasive flow measurement technique that computes the velocity field by reconstructing the 3D positions of individual tracer particles and subsequently tracking those positions. The particle velocity measurement accuracy depends on the faithful reconstruction of 3D particle positions. The complex measurement chain in 3D-PTV involves several steps, from calibration to 3D position reconstruction and particle position tracking, each having its own source of error. Additionally, higher seeding density increases the uncertainty in particle reconstruction and tracking, which in turn increases the noise in the estimated tracks. A noisy track decreases the measurement accuracy and amplifies any noise in the PTV-derived quantities of interest, which include acceleration, pressure, and vorticity. Thus, track filtering techniques are critical in a 3D-PTV measurement. Track fitting using polynomial functions and filtering methods adopted from signal processing and object tracking are among the well-established techniques used to achieve smooth position and velocity estimates from reconstructed particle trajectories. The Kalman filter is one such filtering technique that is widely used in various applications. Its strength lies in its ability to perform noise reduction informed by existing physical models and the uncertainty estimates of recorded measurements. However, the measurement uncertainty input to the Kalman filter needs to be known a priori, which in many cases may not be available or could be difficult to estimate. In the literature on Kalman filters and their variants applied to 2D-PIV/PTV, the position uncertainty data fed to the filter is either user-defined or estimated based on global noise levels in the PTV measurements.
But instantaneous position and velocity uncertainty quantification for individual particle positions/tracks has been challenging in the 3D PTV community. Recent work by Bhattacharya and Vlachos (2020) provides an estimate of the uncertainty in the reconstructed particle positions for a 3D PTV measurement. This position uncertainty estimate dynamically updates the filter gain for each track and enables the evaluation of the performance of the Kalman filter in 3D PTV track filtering.
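The Kalman filtering this abstract describes can be illustrated with a textbook 1-D constant-velocity filter. This is a generic sketch, not the authors' 3D-PTV code; the parameter values (`q`, `r`) are arbitrary illustrative choices, and in the paper's setting the measurement noise `r` is exactly the quantity supplied per-track by the uncertainty estimate.

```python
# Minimal 1-D constant-velocity Kalman filter for track smoothing
# (generic textbook sketch; state is (position, velocity)).
def kalman_1d(zs, dt=1.0, q=1e-3, r=0.25):
    """Filter noisy position measurements zs; returns smoothed positions."""
    x, v = zs[0], 0.0                  # initial state estimate
    P = [[1.0, 0.0], [0.0, 1.0]]       # state covariance
    out = []
    for z in zs:
        # predict: constant-velocity motion model, process noise q
        x = x + v * dt
        P = [[P[0][0] + dt * (P[0][1] + P[1][0]) + dt * dt * P[1][1] + q,
              P[0][1] + dt * P[1][1]],
             [P[1][0] + dt * P[1][1],
              P[1][1] + q]]
        # update: position measurement z with noise variance r
        S = P[0][0] + r                # innovation covariance
        K0, K1 = P[0][0] / S, P[1][0] / S
        resid = z - x
        x, v = x + K0 * resid, v + K1 * resid
        P = [[(1 - K0) * P[0][0], (1 - K0) * P[0][1]],
             [P[1][0] - K1 * P[0][0], P[1][1] - K1 * P[0][1]]]
        out.append(x)
    return out

smoothed = kalman_1d([0.0, 1.1, 1.9, 3.2, 4.0])
assert len(smoothed) == 5 and smoothed[-1] > smoothed[0]
```

The gain `K0` weighs the prediction against each measurement; making `r` track-dependent, as the cited uncertainty-quantification work enables, is what lets the filter adapt per particle.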

  • Research Article
  • 10.14288/1.0220788
Cluster counting in drift chambers for particle identification and tracking
  • Jan 1, 2016
  • Jean‐François Caron

Drift chambers are a type of gaseous ionization detector used in high-energy physics experiments. They can identify charged particles and measure their momentum. When a high-energy charged particle crosses the drift chamber, it ionizes the gas. The liberated electrons drift towards positive-high-voltage wires where an ionization avalanche amplifies the signal. Traditional drift chambers use only the arrival time of the cluster of charge from the closest ionization for tracking, and use only the integral of the whole signal for particle identification. We constructed prototype drift chambers with the ability to resolve the charge cluster signals from individual ionization events. Different algorithms were studied and optimized to best detect the clusters. The improvements to particle identification were studied using a single-cell prototype detector, while the improvements to particle tracking were studied using a multiple-layer prototype. The prototypes were built in the context of initial work for the now-cancelled SuperB project, but the results apply to any drift chambers used in flavour-factory experiments. The results show that the choice of algorithm is not as critical as properly optimizing the algorithm parameters for the dataset. We find that a smoothing time of a few nanoseconds is optimal. This corresponds to bandwidth of a few hundred megahertz, indicating that gigahertz-bandwidth electronics are not required to make use of this technique. Particle identification performance is quantified by the fraction of real pions correctly identified as pions with at most 10% of real pions mis-identified as muons. In our single-cell prototype, the performance increases from 50% to 60% of pions correctly identified when cluster counting is combined with a traditional truncated-mean charge measurement, compared to the charge measurement alone. 
Tracking performance is quantified by the single-cell resolution: the uncertainty in measuring the distance of charged particle tracks from a given sense wire. In our multiple-layer prototype, the single-cell tracking resolution using traditional methods is measured to be ~150μm. With cluster counting implemented, the resolution is unchanged, indicating that the additional cluster information is not useful.
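The cluster-counting idea above — smoothing the digitized waveform over a few nanoseconds and then detecting individual charge clusters — can be sketched as a moving average followed by threshold-crossing detection. This is a generic illustration; the window and threshold values are hypothetical, not the thesis' optimized parameters.

```python
# Sketch of cluster counting on a digitized drift-chamber waveform:
# smooth with a short moving average, then count rising threshold
# crossings as candidate ionization clusters (illustrative parameters).
def count_clusters(samples, window=3, threshold=0.5):
    n = len(samples)
    # moving-average smoothing over up to `window` preceding samples,
    # standing in for the few-nanosecond smoothing time
    smooth = []
    for i in range(n):
        chunk = samples[max(0, i - window + 1):i + 1]
        smooth.append(sum(chunk) / len(chunk))
    # each upward crossing of the threshold is one candidate cluster
    clusters, above = 0, False
    for s in smooth:
        if not above and s > threshold:
            clusters += 1
            above = True
        elif above and s <= threshold:
            above = False
    return clusters

# two separated pulses on a flat baseline are counted as two clusters
wave = [0, 0, 1, 2, 1, 0, 0, 0, 1, 2, 2, 0, 0]
assert count_clusters(wave) == 2
```

The thesis' observation that algorithm tuning matters more than algorithm choice corresponds here to picking `window` and `threshold` for the dataset rather than swapping in a different detector.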

More from: Computing and Software for Big Science
  • Research Article
  • 10.1007/s41781-025-00148-1
Enforcing Fundamental Relations via Adversarial Attacks on Input Parameter Correlations
  • Nov 5, 2025
  • Computing and Software for Big Science
  • Lucie Flek + 7 more

  • Research Article
  • 10.1007/s41781-025-00146-3
Application of Geometric Deep Learning for Tracking of Hyperons in a Straw Tube Detector
  • Oct 21, 2025
  • Computing and Software for Big Science
  • Adeel Akram + 5 more

  • Research Article
  • 10.1007/s41781-025-00133-8
Analysis Facilities for the HL-LHC White Paper
  • Jul 13, 2025
  • Computing and Software for Big Science
  • D Ciangottini + 65 more

  • Research Article
  • 10.1007/s41781-025-00143-6
Performance Portability of the Particle Tracking Algorithm Using SYCL
  • Jul 1, 2025
  • Computing and Software for Big Science
  • Bartosz Soból + 3 more

  • Research Article
  • 10.1007/s41781-025-00142-7
PhyLiNO: a forward-folding likelihood-fit framework for neutrino oscillation physics
  • Jul 1, 2025
  • Computing and Software for Big Science
  • Denise Hellwig + 4 more

  • Research Article
  • 10.1007/s41781-025-00140-9
SymbolFit: Automatic Parametric Modeling with Symbolic Regression
  • Jul 1, 2025
  • Computing and Software for Big Science
  • Ho Fung Tsoi + 8 more

  • Research Article
  • 10.1007/s41781-025-00141-8
A Downstream and Vertexing Algorithm for Long Lived Particles (LLP) Selection at the First High Level Trigger (HLT1) of LHCb
  • Jul 1, 2025
  • Computing and Software for Big Science
  • V Kholoimov + 4 more

  • Research Article
  • 10.1007/s41781-025-00137-4
oidc-agent - Integrating OpenID Connect Tokens with the Command Line
  • May 22, 2025
  • Computing and Software for Big Science
  • Gabriel Zachmann + 2 more

  • Research Article
  • 10.1007/s41781-025-00138-3
KAN We Improve on HEP Classification Tasks? Kolmogorov–Arnold Networks Applied to an LHC Physics Example
  • May 22, 2025
  • Computing and Software for Big Science
  • Johannes Erdmann + 2 more

  • Research Article
  • 10.1007/s41781-025-00139-2
An automated bandwidth division for the LHCb upgrade trigger
  • May 21, 2025
  • Computing and Software for Big Science
  • T Evans + 2 more
