Abstract

We present powerful new analysis techniques to constrain effective field theories at the LHC. By leveraging the structure of particle physics processes, we extract extra information from Monte Carlo simulations, which can be used to train neural network models that estimate the likelihood ratio. These methods scale well to processes with many observables and theory parameters, do not require any approximations of the parton shower or detector response, and can be evaluated in microseconds. We show that they allow us to put significantly stronger bounds on dimension-six operators than existing methods, demonstrating their potential to improve the precision of the LHC legacy constraints.

Highlights

  • New analysis techniques to constrain effective field theories at the Large Hadron Collider (LHC)

  • Extra information is extracted from Monte Carlo simulations, by leveraging the structure of particle physics processes, and used to train neural network models that estimate the likelihood ratio

  • The methods scale well to processes with many observables and theory parameters, require no approximations of the parton shower or detector response, and can be evaluated in microseconds

  • They place significantly stronger bounds on dimension-six operators than existing methods, improving the precision of the LHC legacy constraints

Summary

Constraining Effective Field Theories with Machine Learning

We present powerful new analysis techniques to constrain effective field theories at the LHC. By leveraging the structure of particle physics processes, we extract extra information from Monte Carlo simulations, which can be used to train neural network models that estimate the likelihood ratio. These methods scale well to processes with many observables and theory parameters, do not require any approximations of the parton shower or detector response, and can be evaluated in microseconds. The key idea is that the structure of particle physics processes lets us extract additional information from Monte Carlo simulations that characterizes the dependence of the likelihood on the theory parameters. This augmented data can be used to train neural networks that precisely estimate likelihood ratios, the preferred test statistics for limit setting at the LHC.
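To make the likelihood-ratio estimation idea concrete, here is a minimal sketch of the classic "likelihood ratio trick" that underlies such approaches: a classifier trained to separate events simulated at two theory parameter points converges to s(x) = p1(x) / (p0(x) + p1(x)), so the ratio p0(x)/p1(x) can be recovered as (1 - s(x))/s(x). This is an illustration, not the paper's specific augmented-data methods; the toy 1D Gaussians stand in for simulated observables, and plain logistic regression stands in for the neural network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "simulated events" at two hypothetical parameter points theta0, theta1.
x0 = rng.normal(loc=0.0, scale=1.0, size=5000)   # events drawn at theta0
x1 = rng.normal(loc=0.5, scale=1.0, size=5000)   # events drawn at theta1

X = np.concatenate([x0, x1])
y = np.concatenate([np.zeros_like(x0), np.ones_like(x1)])

# Logistic regression s(x) = sigmoid(w*x + b), fit by full-batch
# gradient descent on the cross-entropy loss.
w, b = 0.0, 0.0
for _ in range(2000):
    s = 1.0 / (1.0 + np.exp(-(w * X + b)))
    w -= 0.1 * np.mean((s - y) * X)
    b -= 0.1 * np.mean(s - y)

def estimated_log_ratio(x):
    """log p(x|theta0) - log p(x|theta1) recovered from the classifier:
    r(x) = (1 - s(x)) / s(x)."""
    s = 1.0 / (1.0 + np.exp(-(w * x + b)))
    return np.log((1.0 - s) / s)

def true_log_ratio(x):
    """Exact log likelihood ratio for the two toy Gaussians."""
    return -0.5 * x**2 + 0.5 * (x - 0.5) ** 2

x_test = np.array([-1.0, 0.0, 1.0])
print(np.abs(estimated_log_ratio(x_test) - true_log_ratio(x_test)).max())
```

For this Gaussian toy the true log ratio is linear in x, so the logistic model is exact and the learned ratio closely matches the analytic one; in realistic LHC processes the density is intractable, which is why a neural network trained on simulations takes the classifier's place.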

Published by the American Physical Society
