Abstract

Since the LHC returned to active service in 2009, the Data Quality Monitoring (DQM) team has faced the need to homogenize and automate operations across all the environments in which DQM is used. The main goal of automation is to reduce operator intervention to the minimum possible level, especially in the area of DQM file management, where long-term archival presents the greatest challenges. Manually operated procedures cannot cope with the constant increase in luminosity, datasets and uptime of the CMS detector. Therefore, a solid and reliable set of scripts, the agents, was designed from the outset to manage all DQM-related workflows. This allows all available resources to be fully exploited under every condition, maximizing performance and reducing the latency with which data become available for validation and certification. The agents can be easily fine-tuned to adapt to current and future hardware constraints and have proved flexible enough to accommodate unforeseen features, such as ad-hoc quota management and a real-time sound alarm system.
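
The abstract does not describe the agents' internals. Purely as an illustration, the sketch below shows what a minimal file-management agent of this kind might look like: a polling loop that moves completed DQM ROOT files into an archive area while enforcing a disk quota and raising an alert when the quota would be exceeded. The paths, quota value, polling interval, and alarm hook are assumptions for the example, not the actual CMS implementation.

```python
import shutil
import time
from pathlib import Path

# Hypothetical locations and limits, for illustration only.
INBOX = Path("/dqm/incoming")      # where completed DQM files appear
ARCHIVE = Path("/dqm/archive")     # long-term archival area
QUOTA_BYTES = 500 * 1024**3        # assumed 500 GB quota on the archive
POLL_SECONDS = 60                  # assumed polling interval


def archive_usage(path: Path) -> int:
    """Return the total size in bytes of all files under `path`."""
    return sum(f.stat().st_size for f in path.rglob("*") if f.is_file())


def run_agent() -> None:
    """Periodically archive DQM ROOT files, stopping (and alerting)
    when the transfer would push the archive over its quota."""
    while True:
        for root_file in sorted(INBOX.glob("*.root")):
            if archive_usage(ARCHIVE) + root_file.stat().st_size > QUOTA_BYTES:
                # Placeholder for the alarm hook; the paper mentions a
                # real-time sound alarm, here we only print a warning.
                print(f"QUOTA EXCEEDED: leaving {root_file.name} in place")
                break
            shutil.move(str(root_file), ARCHIVE / root_file.name)
        time.sleep(POLL_SECONDS)


if __name__ == "__main__":
    run_agent()
```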
