Abstract
The current PDF4LHC recommendation to estimate uncertainties due to parton distribution functions (PDFs) in theoretical predictions for LHC processes involves the combination of separate predictions computed using PDF sets from different groups, each of which comprises a relatively large number of either Hessian eigenvectors or Monte Carlo (MC) replicas. While many fixed-order and parton shower programs allow the evaluation of PDF uncertainties for a single PDF set at no additional CPU cost, this feature is not universal, and, moreover, the a posteriori combination of the predictions using at least three different PDF sets is still required. In this work, we present a strategy for the statistical combination of individual PDF sets, based on the MC representation of Hessian sets, followed by a compression algorithm for the reduction of the number of MC replicas. We illustrate our strategy with the combination and compression of the recent NNPDF3.0, CT14 and MMHT14 NNLO PDF sets. The resulting compressed Monte Carlo PDF sets are validated at the level of parton luminosities and LHC inclusive cross sections and differential distributions. We determine that around 100 replicas provide an adequate representation of the probability distribution for the original combined PDF set, suitable for general applications to LHC phenomenology.
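The combination strategy described above starts from a Monte Carlo representation of the Hessian sets. As an illustration, here is a minimal sketch of one common prescription for generating MC replicas from a symmetric Hessian set, f^(k) = f0 + (1/2) Σ_j (f_j⁺ − f_j⁻) R_j^(k) with R_j^(k) ~ N(0,1). The grid sizes and the toy eigenvector members are invented for the example; in a real application the members would be read from the actual PDF sets (e.g. via LHAPDF), and the paper's exact prescription may differ in detail.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a Hessian PDF set: a central member f0 and
# n_eig pairs of eigenvector members, each evaluated on an x grid.
# (Hypothetical numbers for illustration only.)
n_x, n_eig = 50, 5
f0 = np.linspace(1.0, 0.1, n_x)
f_plus = f0[None, :] * (1.0 + 0.02 * rng.standard_normal((n_eig, n_x)))
f_minus = f0[None, :] * (1.0 - 0.02 * rng.standard_normal((n_eig, n_x)))

def hessian_to_mc(f0, f_plus, f_minus, n_rep, rng):
    """Generate n_rep MC replicas from a symmetric Hessian set:
    f^(k) = f0 + 0.5 * sum_j (f_j^+ - f_j^-) * R_j^(k),  R ~ N(0,1)."""
    r = rng.standard_normal((n_rep, f_plus.shape[0]))
    return f0[None, :] + 0.5 * r @ (f_plus - f_minus)

# With many replicas, the replica average tends to the Hessian central value,
# and the replica spread reproduces the Hessian uncertainty band.
replicas = hessian_to_mc(f0, f_plus, f_minus, n_rep=1000, rng=rng)
```

Once each Hessian set is cast in this form, replicas from the different groups can be merged into a single combined MC ensemble, to which the compression step is then applied.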
Highlights
Introduction
Some of the most widely used Monte Carlo event generators, such as MadGraph5_aMC@NLO [37,38] or POWHEG [39], and NNLO codes such as FEWZ [40], allow the computation of parton distribution function (PDF) uncertainties at no extra cost.
We show how the compression strategy can be applied to native Monte Carlo PDF sets, using the NNPDF3.0 NLO set with Nrep = 1000 replicas as an illustration.
In order to illustrate the performance of the compression algorithm, we consider here the compression of a native Monte Carlo PDF set at Q0 = 1 GeV, based on the prior NNPDF3.0 NLO set with Nrep = 1000 replicas.
Summary
Some of the most widely used Monte Carlo event generators, such as MadGraph5_aMC@NLO [37,38] or POWHEG [39], and NNLO codes such as FEWZ [40], allow the computation of PDF uncertainties at no extra cost. This is not the case for all the theory tools used by the LHC experiments, and even when this feature is available, the envelope method still requires the a posteriori combination of the results obtained with the three sets, which can be quite cumbersome (and error-prone), especially for exclusive calculations that require very large event files. Using the dataset diagonalization method proposed in [43], it is possible to further reduce the number of eigenvectors in the Meta-PDF sets for specific physical applications, such as Higgs production processes.
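The compression step mentioned in the highlights amounts to selecting a subset of replicas whose statistical estimators stay close to those of the prior ensemble. The sketch below illustrates the idea on toy data with a crude stochastic search and an error function built from only the first two moments; the actual algorithm in the paper uses a genetic algorithm and a richer figure of merit (higher moments, correlations, Kolmogorov distance), so this is an assumption-laden simplification, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy prior ensemble: 1000 "replicas", each a vector of PDF values
# on a 40-point x grid (invented numbers for illustration).
prior = rng.normal(loc=1.0, scale=0.1, size=(1000, 40))
mu_p, sig_p = prior.mean(axis=0), prior.std(axis=0)

def error_function(subset):
    """Distance between the first two moments of the subset and the prior.
    (The real CMC-PDF figure of merit also includes higher moments,
    correlations and the Kolmogorov distance.)"""
    mu_s, sig_s = subset.mean(axis=0), subset.std(axis=0)
    return (np.sum((mu_s - mu_p) ** 2 / sig_p**2)
            + np.sum((sig_s - sig_p) ** 2 / sig_p**2))

def compress(prior, n_comp, n_iter, rng):
    """Stochastic search, a crude stand-in for the genetic algorithm:
    keep the best random n_comp-replica subset found over n_iter trials."""
    best_idx, best_err = None, np.inf
    for _ in range(n_iter):
        idx = rng.choice(len(prior), size=n_comp, replace=False)
        err = error_function(prior[idx])
        if err < best_err:
            best_idx, best_err = idx, err
    return best_idx, best_err

# Compress the 1000-replica toy prior down to 100 replicas.
idx, err = compress(prior, n_comp=100, n_iter=500, rng=rng)
```

The design point carried over from the paper is only the objective: the compressed set is judged by how well its statistical estimators reproduce those of the prior, which is also how the ~100-replica compressed sets are validated against parton luminosities and LHC observables.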