We present a collection of tools that efficiently automate the computation of large sets of theory predictions for high-energy physics. Calculating predictions for different processes often requires dedicated programs, which, however, accept inputs and produce outputs that usually differ substantially from one another. The industrialization of theory predictions is achieved by a framework that harmonizes inputs (runcards, parameter settings), standardizes outputs (in the form of grids), produces reusable intermediate objects, and carefully tracks all metadata required to reproduce the computation. Parameter searches and fits of non-perturbative objects are exemplary use cases that require a full or partial re-computation of theory predictions and will thus benefit from such a toolset. As an example application, we present a study of the impact of replacing NNLO QCD K-factors with the exact NNLO predictions in a PDF fit.

Program summary

Program Title: pineline
CPC Library link to program files: https://doi.org/10.17632/dyvns7gnwy.1
Developer's repository link: https://nnpdf.github.io/pineline/
Licensing provisions: GPLv3
Programming language: Python, Rust
Nature of the problem: The computation of theoretical quantities in particle physics often involves computationally intensive tasks, such as the calculation of differential cross sections, that must be carried out in a systematic and reproducible way. Different groups often adopt different conventions and choices, which makes tasks such as the fitting of physical parameters or quantities computationally challenging and hard to reproduce reliably.
Solution method: We create a pipeline of tools such that a user can define an observable and a theory framework and obtain a final object containing all relevant theoretical information. Such objects can then be used in a variety of interchangeable ways (fitting, analysis, experimental comparisons).
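The workflow described in the solution method — harmonized inputs in, a standardized grid-like object carrying full reproducibility metadata out — can be sketched as follows. This is a minimal conceptual illustration only: `Grid`, `compute_grid`, and all field names are hypothetical and are not part of the actual pineline API.

```python
# Conceptual sketch (hypothetical names, not the pineline API): a runcard
# and theory settings produce a standardized object whose metadata allows
# the computation to be reproduced exactly.
from dataclasses import dataclass, field
import hashlib
import json


@dataclass(frozen=True)
class Grid:
    """Stand-in for a standardized output grid."""
    observable: str
    values: tuple                      # placeholder for the interpolation grid
    metadata: dict = field(default_factory=dict)


def compute_grid(runcard: dict, theory: dict) -> Grid:
    """Hypothetical pipeline step: run a (stubbed) calculation and wrap
    its output together with the metadata needed to reproduce it."""
    # Fingerprint the exact inputs so the result can be traced back.
    fingerprint = hashlib.sha256(
        json.dumps({"runcard": runcard, "theory": theory},
                   sort_keys=True).encode()
    ).hexdigest()
    values = (0.0,)  # stand-in for the real cross-section numbers
    return Grid(
        observable=runcard["observable"],
        values=values,
        metadata={"runcard": runcard,
                  "theory": theory,
                  "inputs_sha256": fingerprint},
    )


# Identical inputs yield an identical fingerprint, so any downstream use
# (fitting, analysis, experimental comparison) can verify provenance.
grid = compute_grid({"observable": "Z_pt"}, {"order": "NNLO"})
```

The key design point mirrored here is that the final object is self-describing: everything needed to redo the computation travels with the result, rather than living in a group's private conventions.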