Abstract
The discovery of the particles that shape our universe pushes the scientific community to build increasingly sophisticated equipment. Particle accelerators are among these complex machines; they put particle beams on a collision course at speeds close to that of light. The Large Hadron Collider (LHC) is the world's largest and most powerful beam collider, operating with a collision energy of 13 TeV and a bunch-crossing interval of 25 ns. ATLAS is the largest LHC experiment, comprising several subsystems whose data are combined to reconstruct each collision. When collisions occur, subproducts are produced and absorbed by the calorimeter system, which measures their energy. Typically, a high-energy calorimeter is highly segmented, comprising thousands of dedicated readout channels. The present work evaluates the performance of two cell energy reconstruction algorithms that operate in the ATLAS Tile Calorimeter (TileCal): the baseline algorithm OF2 (Optimal Filter) and COF (Constrained Optimal Filter), which was recently proposed to deal with the signal superposition (pile-up) that is increasingly present in LHC operation. To evaluate the energy estimation efficiency, real data acquired during nominal LHC operation under high-luminosity conditions were used. Statistics from the energy estimation are employed to compare the performance achieved by each method. The results show that the COF method performs better than the OF2 method, pointing out the benefits of using this alternative estimation method.
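The optimal-filtering family of methods mentioned above estimates a pulse amplitude as a weighted linear combination of the digitized calorimeter samples, with weights chosen so that the estimator responds with unit gain to the reference pulse shape while rejecting a constant pedestal. The sketch below illustrates that idea only; the pulse shape and sample count are made-up placeholders, not ATLAS TileCal values, and this is not the actual OF2 or COF implementation.

```python
import numpy as np

# Toy reference pulse shape sampled every 25 ns (arbitrary values,
# NOT the real TileCal pulse shape).
pulse = np.array([0.0, 0.1, 0.6, 1.0, 0.7, 0.3, 0.1])

# Derive weights satisfying the usual optimal-filter linear constraints:
#   sum(w * pulse) = 1  -> unit response to the reference pulse
#   sum(w)         = 0  -> insensitivity to a constant pedestal
# The minimum-norm solution (via the pseudoinverse) corresponds to
# minimum-variance weights under a white-noise assumption.
A = np.vstack([pulse, np.ones_like(pulse)])
weights = np.linalg.pinv(A) @ np.array([1.0, 0.0])

def of_amplitude(samples: np.ndarray) -> float:
    """Estimate the pulse amplitude as a weighted sum of ADC samples."""
    return float(np.dot(weights, samples))

# Usage: a pulse of amplitude 5.0 sitting on a pedestal of 50 ADC counts
# is recovered regardless of the pedestal level.
estimate = of_amplitude(5.0 * pulse + 50.0)
```

Because the constraints are satisfied exactly, the pedestal term cancels and the weighted sum returns the amplitude directly; pile-up from overlapping pulses violates the single-pulse model, which is the situation the COF approach is designed to address.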
Highlights
Since ancient times, humanity has searched for the origin and composition of the universe
The largest particle collider currently in operation is the Large Hadron Collider (LHC), built at CERN in a ring of 27 kilometers approximately 100 meters underground
Searching for new physics, modern high event-rate experiments, such as the ATLAS experiment at the LHC, face an unprecedented increase in the number of interactions per collision, pushing calorimeter design to its limits in order to cope with the immense amount of data produced and the resulting pile-up effect on signal reconstruction
Summary
Since ancient times, humanity has searched for the origin and composition of the universe. A large number of subatomic particles have been identified and their properties exhaustively explored. High-energy particle colliders are machines used to understand the fundamental composition of the universe [1]. In the LHC, proton beams are accelerated to approximately the speed of light and put on a head-on collision course every 25 ns, reaching a maximum energy of 13 TeV. The LHC has been gradually increasing its luminosity, reaching unprecedented conditions and pursuing a large and ambitious physics program. ATLAS [3] is the largest LHC experiment and plays a fundamental role in particle detection research, as it covers a broad physics program, including the Higgs boson discovery and characterization [4] and possible beyond-Standard-Model physics [1]. The ATLAS experiment is composed of several subdetectors: a particle tracking system, which measures the momentum of charged particles; the calorimeter, a highly segmented energy measurement system; and a muon chamber, which detects muons and measures their momentum [5]