Abstract

The evaluation of higher-order cross-sections is an important component in the search for new physics, both at hadron colliders and elsewhere. For most new physics processes of interest, total cross-sections are known at next-to-leading order (NLO) in the strong coupling α_s, and often beyond, via either higher-order terms at fixed powers of α_s or multi-emission resummation. However, the computational cost of such higher-order cross-sections is prohibitive, and precludes efficient evaluation in parameter-space scans beyond two dimensions. Here we describe the software tool xsec, which allows for fast evaluation of cross-sections based on machine-learning regression, using distributed Gaussian processes trained on a pre-generated sample of parameter points. This first version of the code provides all NLO Minimal Supersymmetric Standard Model strong-production cross-sections at the LHC, for individual flavour final states, evaluated in a fraction of a second. Moreover, it calculates regression errors, as well as estimates of errors from higher-order contributions, from uncertainties in the parton distribution functions, and from the value of α_s. While we focus on a specific phenomenological model of supersymmetry, the method readily generalises to any process where it is possible to generate a sufficient training sample.
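For orientation, here is a minimal sketch of the evaluation workflow described above, for gluino pair production at the 13 TeV LHC. The function names, the "gprocs" data directory and the parameter keys follow the usage example published with xsec, and should be checked against the installed version:

```python
import xsec

# The trained Gaussian-process data files are downloaded separately
# into data_dir; "gprocs" is the directory name used in the published
# examples.
xsec.init(data_dir="gprocs")

# Centre-of-mass energy in GeV.
xsec.set_energy(13000)

# Load the trained GPs for gluino pair production, identified by the
# PDG codes of the two final-state particles.
xsec.load_processes([(1000021, 1000021)])

# Parameters are keyed by "m<PDG code>". Gluino pair production also
# depends on the first/second-generation squark masses, set here to a
# common 500 GeV purely for illustration.
xsec.set_parameters({
    "m1000021": 1000,  # gluino mass
    **{f"m{pdg}": 500 for pdg in (1000001, 1000002, 1000003, 1000004,
                                  2000001, 2000002, 2000003, 2000004)},
})

# Evaluate: the central cross-section plus regression, scale, PDF and
# alpha_s uncertainty estimates for the loaded process(es).
xsec.eval_xsection()

# Release the loaded GPs.
xsec.finalise()
```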

Highlights

  • The computational cost of evaluation per parameter point restricts the use of next-to-leading order (NLO) cross-sections to new physics models with only one or two relevant parameters

  • The central approximation that allows for computational gains in distributed Gaussian processes (DGPs) and related approaches is the assumption that the individual experts can be treated as independent, which corresponds to approximating the kernel matrix of the combined problem, i.e. without the partition into experts, as block-diagonal (see the sketch after this list)

  • The reader may wonder at this point why we train our Gaussian process (GP) on the total cross-section in terms of all the non-degenerate masses, rather than on the NLO corrections
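To make the block-diagonal approximation in the second highlight concrete, here is a minimal sketch of independent-expert prediction aggregation using a plain product-of-experts rule with precision weighting; the aggregation rule actually used by xsec may differ:

```python
import numpy as np

def aggregate(means, variances):
    """Combine independent experts' predictions at one test point by
    precision weighting (a plain product-of-experts rule)."""
    means = np.asarray(means, dtype=float)
    precisions = 1.0 / np.asarray(variances, dtype=float)
    total_precision = precisions.sum()
    mean = (precisions * means).sum() / total_precision
    return mean, 1.0 / total_precision

# Three hypothetical experts, each trained on its own subset of the
# data, predicting e.g. log10(sigma/pb) at the same test point.
# Treating them as independent is exactly the block-diagonal
# approximation: each expert only inverts its own kernel block,
# turning one O(n^3) solve into M solves of size n/M.
mu, var = aggregate([1.02, 0.98, 1.01], [0.04, 0.09, 0.05])
print(f"combined: {mu:.3f} +/- {var**0.5:.3f}")
```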


Summary

Gaussian process regression

The basic objective in regression is to estimate an unknown function value f(x) at some new point x, given that we know the function values at some other points. A Bayesian approach to this task is provided by Gaussian process regression, in which we express our degree of belief about any set of function values as a joint Gaussian pdf, which formally describes our degree of belief about the possible function values at both the training points X and the test point x∗ before we look at the training data. This prior is chosen indirectly by choosing a mean function m(·) and a covariance function or kernel k(·, ·), defined to specify the following expectation values for arbitrary input points:

$m(x) = \mathbb{E}[f(x)], \qquad k(x, x') = \mathbb{E}\big[(f(x) - m(x))(f(x') - m(x'))\big].$

Conditioning this joint Gaussian prior on the observed training values y turns it into a univariate Gaussian posterior for f(x∗). The mean and variance of this univariate Gaussian can be expressed in closed form as

$\mu_* = m(x_*) + k_*^{\mathsf{T}} K^{-1}\,(y - m(X)), \qquad \sigma_*^2 = k(x_*, x_*) - k_*^{\mathsf{T}} K^{-1} k_*,$

where $K_{ij} = k(x_i, x_j)$ is the kernel matrix on the training points and $(k_*)_i = k(x_i, x_*)$.
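As a minimal numerical illustration of these closed-form expressions, assuming a zero mean function and a squared-exponential kernel (illustrative choices, not necessarily those used in xsec):

```python
import numpy as np

def kernel(a, b, ell=1.0, sf2=1.0):
    """Squared-exponential kernel k(x, x') = sf2 * exp(-(x - x')^2 / (2 ell^2))."""
    d = a[:, None] - b[None, :]
    return sf2 * np.exp(-0.5 * (d / ell) ** 2)

def gp_predict(x_train, y_train, x_star, noise=1e-4):
    """Closed-form posterior mean and variance at one test point,
    assuming a zero prior mean function m(x) = 0."""
    x_star = np.atleast_1d(float(x_star))
    K = kernel(x_train, x_train) + noise * np.eye(x_train.size)
    k_star = kernel(x_train, x_star)          # vector k_*
    alpha = np.linalg.solve(K, y_train)       # K^-1 y
    v = np.linalg.solve(K, k_star)            # K^-1 k_*
    mean = (k_star.T @ alpha)[0]              # mu_* = k_*^T K^-1 y
    var = kernel(x_star, x_star)[0, 0] - (k_star.T @ v)[0, 0]
    return mean, var

# Toy training data: noisy samples of a smooth 1-D function.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0.0, 10.0, 50))
y = np.sin(x) + 0.01 * rng.normal(size=x.size)

mu, var = gp_predict(x, y, 5.0)
print(f"f(5.0) = {mu:.3f} +/- {var**0.5:.3f}   (truth: {np.sin(5.0):.3f})")
```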

Kernel choice and optimisation
Regularisation of the covariance matrix
Distributed Gaussian processes and prediction aggregation
Sample generation
Calculation of NLO training cross-sections
Training implementation details
Validation
Gluino pair production
Pair production of gluinos with first or second-generation squarks
First and second-generation squark–anti-squark pair production
First and second-generation squark pair production
Stop and sbottom pair production
Speeding up cross-section evaluation
Installation
Python interface
Cross-section prediction
Command-line interface
Code structure
Findings
Conclusions