Abstract

An important part of the Large Hadron Collider (LHC) legacy will be precise limits on indirect effects of new physics, framed for instance in terms of an effective field theory. These measurements often involve many theory parameters and observables, which makes them challenging for traditional analysis methods. We discuss the underlying problem of “likelihood-free” inference and present powerful new analysis techniques that combine physics insights, statistical methods, and the power of machine learning. We have developed MadMiner, a new Python package that makes it straightforward to apply these techniques. In example LHC problems we show that the new approach lets us put stronger constraints on theory parameters than established methods, demonstrating its potential to improve the new physics reach of the LHC legacy measurements. While we present techniques optimized for particle physics, the likelihood-free inference formulation is much more general, and these ideas are part of a broader movement that is changing scientific inference in fields as diverse as cosmology, genetics, and epidemiology.

Highlights

  • Measurements of particle collisions at the Large Hadron Collider (LHC) provide a wealth of high-dimensional data x. These data are used to measure phenomena predicted by the Standard Model of particle physics (SM) and to search for indications of physics beyond the SM

  • In the effective field theory (EFT) context, these parameters are known as Wilson coefficients, but the following discussion applies equally to other physical properties; the standard EFT expansion is written out after this list

  • Histograms of summary statistics are often used for this purpose in high energy physics; a minimal sketch of this approach also follows this list
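
In the EFT language, the parameters θ enter as coefficients of higher-dimensional operators added to the SM Lagrangian. This expansion is not written out in the original text; its standard dimension-six form is

$$\mathcal{L}_{\text{EFT}} = \mathcal{L}_{\text{SM}} + \sum_i \frac{c_i}{\Lambda^2}\,\mathcal{O}_i + \ldots,$$

where the $\mathcal{O}_i$ are dimension-six operators built from SM fields, $\Lambda$ is the scale of new physics, and the Wilson coefficients $c_i$ play the role of the parameters θ.

To make the histogram approach concrete, here is a minimal sketch (a toy example added for illustration: the Gaussian "simulator", the binning, and all names are assumptions, not part of the paper or of the MadMiner API). Each event is reduced to a single summary statistic, the statistic is histogrammed, and a binned Poisson likelihood compares the observed counts to simulated expectations across a scan of θ:

```python
import numpy as np
from scipy.stats import poisson

def simulate(theta, n_events, rng):
    """Toy stand-in for the full simulation chain: draws a single summary
    statistic (think: an invariant mass in GeV) whose location shifts
    with the theory parameter theta."""
    return rng.normal(loc=100.0 + 5.0 * theta, scale=10.0, size=n_events)

rng = np.random.default_rng(0)
bins = np.linspace(50.0, 150.0, 21)

# "Observed" data, generated here at theta = 0.5 purely for the demo.
observed, _ = np.histogram(simulate(0.5, 1_000, rng), bins=bins)

# Parameter scan: histogram a large simulated sample for each theta and
# evaluate the binned Poisson likelihood of the observed counts.
thetas = np.linspace(-2.0, 2.0, 41)
log_like = []
for theta in thetas:
    mc, _ = np.histogram(simulate(theta, 100_000, rng), bins=bins)
    expected = np.clip(mc * (1_000 / 100_000), 1e-3, None)  # rescale MC, avoid mu = 0
    log_like.append(poisson.logpmf(observed, expected).sum())

print("best-fit theta:", thetas[np.argmax(log_like)])
```

The choice of summary statistic is where information can be lost: if it washes out the dependence on θ, no amount of data recovers it. That limitation is what the machine-learning techniques presented in the paper are designed to overcome.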

Summary

The inference challenge

Measurements of particle collisions at the Large Hadron Collider (LHC) provide a wealth of high-dimensional data x. With an effective field theory (EFT) approach, possible deviations from the SM can be parameterized and studied systematically [1,2,3]. This requires the simultaneous measurement of many parameters θ. Samples of collision processes at the LHC are generated via a chain of Monte Carlo simulators. These describe the hard scattering of partons z_p, the subsequent parton showering and hadronization, and the interactions with the detector and signal propagation, up to a set of measurements of reconstructed quantities x. By running the simulation chain, it is possible to generate samples drawn from p(x|θ) for various settings of θ.
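
This setup can be illustrated with a toy sketch (illustrative only: the two-stage "simulator" and all names are assumptions standing in for the actual chain of hard scattering, showering, and detector simulation). Sampling x ~ p(x|θ) just means running the chain forward, but evaluating p(x|θ) at a given x would require integrating over every latent configuration that could have produced it, which is what makes the problem "likelihood-free":

```python
import numpy as np

def simulate_event(theta, rng):
    """Toy stand-in for the LHC simulation chain: first sample latent
    variables z (the parton-level physics), then smear them into a
    reconstructed observable x (shower, hadronization, detector).
    Each step is easy to sample; p(x|theta) is only defined implicitly."""
    z = rng.exponential(scale=1.0 + theta**2)  # "hard scattering" latent state
    x = z + rng.normal(scale=0.3)              # "shower + detector" smearing
    return x

rng = np.random.default_rng(42)
theta = 0.8

# Forward direction is easy: draw samples x ~ p(x|theta) for any theta ...
samples = np.array([simulate_event(theta, rng) for _ in range(10_000)])
print(f"mean observable at theta = {theta}: {samples.mean():.3f}")

# ... but there is no function to *evaluate* p(x|theta) at a given x:
# that would require integrating over every latent path z that could
# have produced it. This intractability is the "likelihood-free" problem.
```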

The rest of the summary covers:

  • Established inference techniques
      • Histograms of summary statistics
      • Matrix element method
  • The usefulness of gold
      • General approach
      • Available algorithms
  • The MadMiner package
      • Exemplary workflow
      • Achieving optimal performance
      • Inclusion of systematic uncertainties
  • High energy physics and beyond
  • Findings
  • Conclusions