Abstract
The Data Quality Monitoring software proved to be a central tool in the Compact Muon Solenoid (CMS) experiment. Its flexibility allowed it to be integrated in several environments: online, for real-time detector monitoring, and offline, for the final, fine-grained data certification. The usage of the Data Quality Monitoring software in these different environments and its integration in the CMS reconstruction software framework and in all production workflows are presented. The main technical challenges and the solutions adopted for them are also discussed, with emphasis on functionality, long-term robustness, and performance.
Highlights
The Large Hadron Collider (LHC) [3] at CERN [4] collides protons at close to the speed of light.
The Data Quality Monitoring software proved to be a central tool in the Compact Muon Solenoid experiment.
Some of the collision energy is turned into mass, creating new particles which are observed in the Compact Muon Solenoid (CMS) [5] particle detector.
Summary
The Large Hadron Collider (LHC) [3] at CERN [4] collides protons at close to the speed of light. Some of the collision energy is turned into mass, creating new particles which are observed in the Compact Muon Solenoid (CMS) [5] particle detector. CMS data are analyzed by physicists around the world to reconstruct a picture of what happened at the heart of these collisions. Data quality monitoring (DQM) is a crucial part of the experiment. Its purpose is to identify errors and problems in the detector hardware or reconstruction software. The ultimate goal is a stable detector leading to high-quality reconstructed collision events. There are two main branches of the monitoring framework, online and offline, and each is discussed in detail below.
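To give a flavor of what such a monitoring framework does, the sketch below compares a monitored histogram against a reference and flags bins that deviate beyond a tolerance. This is a hypothetical illustration of the general idea of a DQM quality test; the function name, thresholds, and plain-list histograms are assumptions for the example and do not reflect the actual CMS DQM software interfaces.

```python
# Hypothetical sketch of a DQM-style quality test: flag histogram bins
# whose content deviates from a reference by more than a relative
# tolerance. Names and thresholds are illustrative, not the CMS DQM API.

def quality_test(monitored, reference, tolerance=0.2):
    """Compare two binned distributions bin by bin.

    Returns (status, bad_bins): status is 'GOOD' if every monitored bin
    is within `tolerance` (relative) of the reference bin, else 'BAD'.
    """
    bad_bins = []
    for i, (m, r) in enumerate(zip(monitored, reference)):
        if r == 0:
            # Reference expects an empty bin: any content is suspicious.
            if m != 0:
                bad_bins.append(i)
        elif abs(m - r) / r > tolerance:
            bad_bins.append(i)
    return ("GOOD" if not bad_bins else "BAD", bad_bins)

# Example: the last bin deviates by 50%, exceeding the 20% tolerance.
status, bad = quality_test([100, 120, 150], [100, 120, 100])
```

In practice a test like this would run continuously on histograms filled from live (online) or reconstructed (offline) data, and the resulting GOOD/BAD flags would feed the data-certification decision described above.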