Abstract

Data quality monitoring (DQM) is an important and integral part of the data-taking process of HEP experiments. DQM involves automated analysis of monitoring data through user-defined algorithms and relaying the summary of the analysis results while data is being processed. When DQM occurs in the online environment, it provides the shifter with current run information that can be used to overcome problems early on. During offline reconstruction, DQM performs more complex analysis of physics quantities, and the results are used to assess the quality of the reconstructed data. The ATLAS Data Quality Monitoring Framework (DQMF) is a distributed software system providing DQM functionality in the online environment. The DQMF has a scalable architecture, achieved by distributing the execution of the analysis algorithms over a configurable number of DQMF agents running on different nodes connected over the network. The core part of the DQMF is designed to depend only on software that is common to the online and offline environments (such as ROOT) and is therefore used in the offline framework as well. This paper describes the main requirements, the architectural design, and the implementation of the DQMF.
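The "user-defined algorithms" above are, in the real framework, C++ classes operating on ROOT histograms. The following is only an illustrative sketch of the idea, with a histogram reduced to a list of bin counts and a hypothetical `check_outlier_fraction` algorithm that maps a histogram to a quality flag; none of these names come from the actual DQMF API.

```python
# Illustrative sketch only: a toy DQ algorithm that flags a histogram
# according to the fraction of entries falling outside an expected range.
# The real DQMF algorithms are C++ classes operating on ROOT histograms.
from dataclasses import dataclass
from enum import Enum


class Status(Enum):
    GREEN = "Good"
    YELLOW = "Warning"
    RED = "Bad"


@dataclass
class Thresholds:
    warning: float  # outlier fraction above which the result is Warning
    error: float    # outlier fraction above which the result is Bad


def check_outlier_fraction(bins, expected_range, thresholds):
    """Toy user-defined algorithm: compute the fraction of entries whose
    bin index lies outside `expected_range` and compare it to thresholds."""
    total = sum(bins)
    if total == 0:
        return Status.RED  # an empty histogram gives nothing to assess
    lo, hi = expected_range
    outside = sum(count for i, count in enumerate(bins) if not lo <= i <= hi)
    fraction = outside / total
    if fraction >= thresholds.error:
        return Status.RED
    if fraction >= thresholds.warning:
        return Status.YELLOW
    return Status.GREEN
```

In the framework such results are summarized and propagated to the shifter while the run is in progress; the three-valued flag here simply mirrors that summary idea.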

Highlights

  • ATLAS is one of the four experiments at the Large Hadron Collider (LHC) [2] at CERN

  • Data Quality Monitoring Framework (DQMF) interacts with the Online Monitoring Services, as well as with some other Online Services provided as part of the ATLAS TDAQ [4] software infrastructure (Fig. 1), in order to fulfill its objectives, in particular:

  • Online Histogramming Service (OH) is used to retrieve histograms produced in the current run and to transmit requests to histogram providers
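The abstract notes that scalability comes from spreading the analysis algorithms over a configurable number of DQMF agents. A minimal sketch of one way such a partition could work is shown below; the function name `assign_to_agents` and the hash-based scheme are purely hypothetical and do not correspond to the actual DQMF or OH interfaces.

```python
# Hypothetical sketch: deterministically partition histogram checks over a
# configurable number of agents, so that each agent subscribes only to its
# own share of the histograms published through a service such as OH.
from collections import defaultdict
from zlib import crc32


def assign_to_agents(histogram_names, n_agents):
    """Map each histogram name to one of n_agents workers via a stable hash,
    returning {agent_index: [histogram names handled by that agent]}."""
    shares = defaultdict(list)
    for name in histogram_names:
        shares[crc32(name.encode()) % n_agents].append(name)
    return dict(shares)
```

A stable hash keeps the assignment reproducible across restarts without any central coordination, which is one simple way to spread load over nodes connected through the network.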


Summary

27 February 2008


INTRODUCTION
GENERAL DESCRIPTION
DQ Results
MAIN REQUIREMENTS
ARCHITECTURAL DESIGN
SCALABILITY
DQM Workbench
IMPLEMENTATION
DQA offline
CONCLUSIONS
