Abstract

We propose to replace the exact amplitudes used in Monte Carlo event generators with trained machine learning regressors, with the aim of speeding up the evaluation of slow amplitudes. As a proof of concept, we study the process $gg \rightarrow ZZ$, whose leading-order amplitude is loop induced. We show that gradient boosting machines like xgboost can predict the fully differential distributions with errors below 0.1%, and with prediction times $\mathcal{O}(10^{3})$ faster than the evaluation of the exact function. This is achieved with training times of $\sim 23$ minutes and regressors of size $\lesssim 22$ MB. We also find that xgboost performs well over the entire phase space, while interpolation gives much larger errors in regions where the function is peaked. These results suggest a possible new avenue to speed up Monte Carlo event generators.
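The workflow described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the peaked `toy_amplitude` below is an invented stand-in for the true loop-induced $gg \rightarrow ZZ$ amplitude, the two input features stand in for phase-space variables, and scikit-learn's `GradientBoostingRegressor` is used as a readily available proxy for xgboost.

```python
# Hedged sketch: learn an expensive, peaked function with a gradient-boosted
# regressor, then evaluate the fit on held-out phase-space points.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Toy "phase-space" points: two kinematic variables, each in [0, 1).
X = rng.random((20000, 2))

def toy_amplitude(x):
    # Invented peaked function standing in for a slow, exact amplitude.
    return 1.0 / ((x[:, 0] - 0.5) ** 2 + 0.01) + np.sin(5 * x[:, 1])

y = toy_amplitude(X)

# Train the regressor once (the "offline" cost), then predictions are cheap.
reg = GradientBoostingRegressor(n_estimators=300, max_depth=4, learning_rate=0.1)
reg.fit(X, y)

# Check accuracy on independent test points.
X_test = rng.random((2000, 2))
y_test = toy_amplitude(X_test)
rel_err = np.abs(reg.predict(X_test) - y_test) / np.abs(y_test)
print(f"median relative error: {np.median(rel_err):.4f}")
```

In an event-generator setting, the trained regressor would then replace calls to the exact amplitude inside the phase-space integration loop, with the exact function reserved for generating training data.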
