Abstract

We present a compression algorithm for parton densities using synthetic replicas generated from the training of a generative adversarial network (GAN). The generated replicas are used to further enhance the statistics of a given Monte Carlo PDF set prior to compression. This results in a compression methodology that is able to provide a compressed set with a smaller number of replicas and a more adequate representation of the original probability distribution. We also address the question of whether the GAN could be used as an alternative mechanism to avoid the fitting of a large number of replicas.
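As a rough illustration of this two-step pipeline, the sketch below stubs the trained GAN generator with a Gaussian resampling of the prior and replaces the compressor minimization with a random search; the function `enhance_and_compress` and all interfaces are hypothetical and do not reflect the actual ganpdfs/pycompressor API.

```python
# Sketch of the GAN-enhanced compression pipeline under assumed interfaces:
# the real method trains a GAN (ganpdfs) and runs a genetic-algorithm
# compressor (pycompressor); here the generator is a Gaussian resampling of
# the prior and the minimization a simple random search.
import numpy as np

def enhance_and_compress(prior, n_synthetic, n_compressed, n_trials=5000, seed=0):
    """prior: (n_replicas, n_points) array of PDF values on a grid.
    Returns the indices of the selected replicas in the enhanced set."""
    rng = np.random.default_rng(seed)

    # Stand-in for the trained GAN generator producing synthetic replicas.
    mean, std = prior.mean(axis=0), prior.std(axis=0)
    synthetic = mean + std * rng.standard_normal((n_synthetic, prior.shape[1]))
    enhanced = np.concatenate([prior, synthetic])

    # Stand-in for the compressor: keep the subset whose low-order moments
    # are closest to those of the enhanced (prior + synthetic) ensemble.
    target_mean, target_std = enhanced.mean(axis=0), enhanced.std(axis=0)
    best_idx, best_err = None, np.inf
    for _ in range(n_trials):
        idx = rng.choice(len(enhanced), n_compressed, replace=False)
        err = (np.abs(enhanced[idx].mean(axis=0) - target_mean).mean()
               + np.abs(enhanced[idx].std(axis=0) - target_std).mean())
        if err < best_err:
            best_idx, best_err = idx, err
    return best_idx

# Example: compress a toy 1000-replica set to 50 after adding 1000 synthetics.
toy_prior = np.random.default_rng(1).normal(size=(1000, 70))
selected = enhance_and_compress(toy_prior, n_synthetic=1000, n_compressed=50)
```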

Highlights

  • Parton distribution functions (PDFs) are crucial ingredients for all predictions of physical observables at hadron colliders such as the LHC, and efforts to push their uncertainties to smaller values are becoming increasingly relevant

  • We quantify the performance of the GAN-enhanced compression methodology described in the previous section based on various statistical estimators (see the sketch after this list)

  • In order to estimate how good the generative adversarial network (GAN)-compressor framework is compared to the previous methodology, we subject the compressed sets resulting from both methodologies to the same statistical estimators
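As a simplified stand-in for such estimators (not the pycompressor implementation), the sketch below compares low-order moments and point-to-point correlations of a candidate compressed subset with those of the prior set; all function names are illustrative.

```python
# Illustrative estimators of the kind used to validate a compressed subset
# against the prior: low-order moments over the replica axis and a check on
# the correlations between grid points.
import numpy as np

def moment_estimators(replicas):
    """replicas: (n_replicas, n_points) array of PDF values on a grid."""
    mean = replicas.mean(axis=0)
    std = replicas.std(axis=0, ddof=1)
    centred = replicas - mean
    skew = (centred ** 3).mean(axis=0) / std ** 3
    kurt = (centred ** 4).mean(axis=0) / std ** 4
    return mean, std, skew, kurt

def estimator_distance(prior, subset):
    """Aggregate normalized difference between the estimators of the prior
    set and of a candidate compressed subset (smaller is better)."""
    moments = zip(moment_estimators(prior), moment_estimators(subset))
    corr_diff = np.abs(np.corrcoef(prior, rowvar=False)
                       - np.corrcoef(subset, rowvar=False)).mean()
    return sum(np.abs(p - s).mean() / (np.abs(p).mean() + 1e-12)
               for p, s in moments) + corr_diff
```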


Summary

Introduction

Parton distribution functions (PDFs) are crucial ingredients for all predictions of physical observables at hadron colliders such as the LHC, and efforts to push their uncertainties to smaller values are becoming increasingly relevant: they are one of the dominant sources of uncertainty in precision measurements. One way to improve the statistical accuracy of a Monte Carlo PDF determination is to fit more replicas; indeed, one of the main differences between a fit with 100 and one with 1000 Monte Carlo replicas is that correlations between PDFs are reproduced more accurately in the latter [6]. However, having to deal with such a large ensemble of replicas when producing phenomenological studies is not ideal. To address this issue, a compression methodology that reduces the original Monte Carlo PDF set to a smaller subset was introduced in Ref. Although the techniques described in this paper might be generalizable to produce larger PDF sets, we emphasize that our main goal is to provide a technique that minimizes the information loss incurred when compressing a larger set into a smaller one.
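The point about correlations can be made concrete with a small numerical experiment. The following toy example (assumed numbers, not results from the paper) shows that a correlation matrix estimated from 100 replicas fluctuates around the underlying one considerably more than the estimate obtained from 1000 replicas.

```python
# Toy illustration of why correlations are reproduced more accurately by a
# 1000-replica ensemble than by a 100-replica one: the sample correlation
# matrix estimated from fewer replicas deviates more from the true one.
import numpy as np

rng = np.random.default_rng(2)
n_points = 40                                   # PDF values on a small toy grid
true_cov = np.eye(n_points) + 0.3               # toy covariance with uniform correlations
replicas = rng.multivariate_normal(np.zeros(n_points), true_cov, size=1000)

def correlations(sample):
    """Correlation between grid points, estimated over the replica ensemble."""
    return np.corrcoef(sample, rowvar=False)

true_corr = 0.3 / 1.3                           # exact off-diagonal correlation
diff_100 = np.abs(correlations(replicas[:100]) - true_corr)
diff_1000 = np.abs(correlations(replicas) - true_corr)
off_diag = ~np.eye(n_points, dtype=bool)
# The 100-replica estimate deviates noticeably more off the diagonal.
print(diff_100[off_diag].mean(), diff_1000[off_diag].mean())
```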

Compression: methodological review
How to GAN PDFs?
Introduction to GANs for PDFs
Challenges in training GANs
The ganpdfs methodology
The GAN-enhanced compressor
Results
Validation of the GAN-compressor
Performance of the GAN-enhanced compressor
Generalization capability of GANs for PDFs
Conclusions
A Benchmark of pycompressor against compressor
