Abstract

The reconstruction of charged particles will be a key computing challenge for the high-luminosity Large Hadron Collider (HL-LHC), where increased data rates lead to a large increase in running time for current pattern recognition algorithms. An alternative approach explored here expresses pattern recognition as a quadratic unconstrained binary optimization (QUBO), which allows algorithms to be run on classical and quantum annealers. While the overall timing of the proposed approach and its scaling have yet to be measured and studied, we demonstrate that, in terms of efficiency and purity, the physics performance of current LHC tracking algorithms can be matched. More research will be needed to achieve comparable performance under HL-LHC conditions, as increasing track density decreases the purity of the QUBO track segment classifier.
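
For reference, a QUBO asks for the binary assignment that minimizes a quadratic objective. The standard form below is a schematic of the kind of objective meant here, with illustrative interpretations of the coefficients rather than the paper's exact Hamiltonian:

    \min_{x \in \{0,1\}^N} \; O(x) \;=\; \sum_{i} a_i\, x_i \;+\; \sum_{i<j} b_{ij}\, x_i x_j

where each binary variable x_i marks whether a candidate track segment is kept, the linear weights a_i encode individual segment quality, and the quadratic couplings b_ij reward compatible segment pairs and penalize conflicting ones.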

Highlights

  • Quantum computers are rapidly being made available both in the cloud and as prototypes in academic and industrial settings

  • We present an alternative approach that expresses pattern recognition as a quadratic unconstrained binary optimization (QUBO, an NP-hard problem) solved by annealing, a process for finding the global minimum of an objective function, in our case a quadratic function over binary variables based on the algorithm introduced in Ref. [5], following ideas in Refs. [6, 7] (a schematic sketch follows this list)

  • We study the performance of the algorithm as a function of the particle multiplicity

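As a concrete illustration of the annealing idea in the highlights above, the following is a minimal, self-contained sketch: a toy QUBO over a handful of binary segment variables, minimized with single-flip simulated annealing. All coefficients, variable names, and the solver itself are illustrative assumptions for this sketch; they are not the Hamiltonian, software, or hardware used in the study.

    import math
    import random

    def qubo_energy(x, a, b):
        """Energy of a QUBO: sum_i a[i]*x[i] + sum_{i<j} b[(i,j)]*x[i]*x[j]."""
        e = sum(a[i] * x[i] for i in range(len(x)))
        e += sum(w * x[i] * x[j] for (i, j), w in b.items())
        return e

    def simulated_annealing(a, b, n_steps=20000, t_start=2.0, t_end=0.01, seed=0):
        """Single-spin-flip simulated annealing over binary variables."""
        rng = random.Random(seed)
        n = len(a)
        x = [rng.randint(0, 1) for _ in range(n)]
        e = qubo_energy(x, a, b)
        best_x, best_e = list(x), e
        for step in range(n_steps):
            t = t_start * (t_end / t_start) ** (step / (n_steps - 1))  # geometric cooling
            i = rng.randrange(n)
            x[i] ^= 1                      # propose flipping one variable
            e_new = qubo_energy(x, a, b)
            if e_new <= e or rng.random() < math.exp(-(e_new - e) / t):
                e = e_new                  # accept the move (Metropolis criterion)
                if e < best_e:
                    best_x, best_e = list(x), e
            else:
                x[i] ^= 1                  # reject: undo the flip
        return best_x, best_e

    # Toy problem: 4 candidate track segments. Negative linear terms favor
    # keeping good segments; negative couplings reward compatible (same-track)
    # pairs, positive couplings penalize conflicting ones.
    a = [-1.0, -1.0, -0.8, -0.2]
    b = {(0, 1): -0.9,   # segments 0 and 1 align well: reward selecting both
         (1, 2): +2.0,   # segments 1 and 2 share a hit: conflict penalty
         (2, 3): -0.5}
    print(simulated_annealing(a, b))

Single-flip Metropolis updates with a geometric cooling schedule are the simplest classical annealer; quantum annealers and production-scale classical solvers use different dynamics and far larger problems, but they optimize an objective of the same quadratic form.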

Introduction

Quantum computers are rapidly being made available both in the cloud and as prototypes in academic and industrial settings. These devices span the range from commercial D-Wave quantum annealers [1] to gate-based quantum processor prototypes based on a wide range of promising technologies [2]. Quantum computing holds the potential for super-polynomial speedups and a large decrease in energy usage, if suitable algorithms can be developed. The reconstruction of charged particles will be a key computing challenge for the high-luminosity Large Hadron Collider (HL-LHC), where increased data rates lead to a large increase in running time for conventional pattern recognition algorithms. Conventional algorithms [3, 4], which are based on combinatorial track seeding and building, scale quadratically or worse as a function of the detector occupancy.
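
A rough back-of-envelope illustrates the scaling claim: if seeding pairs hits across two layers, the number of candidate doublets grows with the product of the hit counts, and triplet seeds grow faster still. The numbers below are hypothetical and ignore the geometric cuts that real seeding applies; only the trend with occupancy is the point.

    # Illustrative only: how candidate doublets/triplets grow with occupancy.
    # Hit counts per layer are hypothetical, not taken from the paper.
    def candidate_seeds(hits_per_layer):
        doublets = hits_per_layer ** 2   # all pairs across two layers
        triplets = hits_per_layer ** 3   # all combinations across three layers
        return doublets, triplets

    for n in (1_000, 2_000, 4_000):      # occupancy doubling twice
        d, t = candidate_seeds(n)
        print(f"{n:>5} hits/layer -> {d:.1e} doublets, {t:.1e} triplets")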

