Abstract

The Belle II detector began collecting data from e+e− collisions at the SuperKEKB electron-positron collider in March 2019. Belle II aims to collect a data sample 50 times larger than that of the previous generation of B-factories. For Belle II analyses to be competitive, it is crucial that calibration payloads for these data are calculated promptly, prior to data reconstruction. To accomplish this, a Python plugin package has been developed on top of the open-source Apache Airflow package, using Directed Acyclic Graphs (DAGs) to describe the ordering of processes and Flask to provide administration and job submission web pages. DAGs for calibration process submission, monitoring of incoming data files, and validation of calibration payloads have been created to help automate the calibration procedure. Flask plugin classes extend the built-in Airflow administration and monitoring web pages, and authentication is handled through the pre-existing X.509 grid certificates of Belle II users.
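As an illustration of the DAG-based approach described above, the following is a minimal sketch (not the Belle II production code) of how a prompt-calibration sequence could be expressed as an Airflow DAG. The task names and Python callables are hypothetical placeholders, and the import paths correspond to recent Airflow releases.

```python
# Minimal sketch of a prompt-calibration workflow expressed as an Airflow DAG.
# Only the task ordering is shown; the callables are hypothetical stand-ins
# for the real checking, submission, and validation steps.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def check_input_files(**context):
    """Placeholder: verify that the required raw-data files have arrived."""
    ...


def submit_calibration(**context):
    """Placeholder: launch the calibration jobs on a batch system."""
    ...


def validate_payloads(**context):
    """Placeholder: run validation checks on the produced payloads."""
    ...


with DAG(
    dag_id="prompt_calibration_example",
    start_date=datetime(2019, 3, 1),
    schedule_interval=None,  # triggered per data-taking bucket, not on a timer
    catchup=False,
) as dag:
    check = PythonOperator(task_id="check_input_files",
                           python_callable=check_input_files)
    calibrate = PythonOperator(task_id="submit_calibration",
                               python_callable=submit_calibration)
    validate = PythonOperator(task_id="validate_payloads",
                              python_callable=validate_payloads)

    # The DAG encodes the ordering of the calibration procedure.
    check >> calibrate >> validate
```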

Highlights

  • In March 2019 the Belle II detector began collecting data from e+e− collisions at the SuperKEKB electron-positron collider [1]

  • Particle Identification (PID) efficiency and B meson vertex resolution strongly affect the statistical power of most physics analyses

  • There are benefits to providing prompt datasets, available on a timescale of approximately two weeks, that are as close to publication quality as possible

Summary

Introduction

In March 2019 the Belle II detector began collecting data from e+e− collisions at the SuperKEKB electron-positron collider [1]. Providing promptly calibrated datasets allows physicists to begin their analyses early and to switch to the official publication datasets for their final results. The Belle II Calibration Framework (CAF) provides an API to parallelize the submission of basf2 calibration processes on different batch system backends and is used by most calibrations at Belle II. It allows large amounts of data to be processed to create calibration payload files, similar to the workflows used for automated calibration at the CMS and LHCb experiments [6, 7], but the CAF can be run at any computing centre so long as the necessary input data is located there.
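For orientation, the snippet below is a rough sketch of how a calibration might be defined and submitted through the CAF in a basf2 environment, following the pattern of the public CAF tutorials. The collector name, algorithm class, input-file paths, and backend settings are illustrative assumptions rather than a verified reproduction of the current API.

```python
# Rough sketch, assuming a basf2 environment; names below are illustrative
# approximations of the CAF API, not verified production code.
from caf.framework import CAF, Calibration
from caf import backends
from ROOT import Belle2  # calibration algorithm classes live in the Belle2 namespace

# Hypothetical inputs: a collector module, its matching algorithm,
# and the raw-data files to process.
algorithm = Belle2.TestCalibrationAlgorithm()
input_files = ["/path/to/raw/run1.root", "/path/to/raw/run2.root"]

cal = Calibration(
    name="ExampleCalibration",
    collector="CaTest",        # basf2 collector module (placeholder name)
    algorithms=algorithm,
    input_files=input_files,
)

cal_fw = CAF()
cal_fw.add_calibration(cal)
# Choose a batch backend: Local() runs collector jobs as parallel subprocesses,
# while cluster backends submit them to a batch system instead.
cal_fw.backend = backends.Local(max_processes=4)
cal_fw.run()  # produces calibration payload files in the CAF output directory
```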

Belle II data processing
Current prompt calibration procedure
Problems facing the current situation
Airflow overview
Prompt calibration in Airflow
Early operation
