Abstract
The Fortran LHAPDF library has been a long-term workhorse in particle physics, providing standardised access to parton density functions for experimental and phenomenological purposes alike, following on from the venerable PDFLIB package. During Run 1 of the LHC, however, several fundamental limitations in LHAPDF's design became deeply problematic, restricting the usability of the library for important physics-study procedures and providing dangerous avenues by which to silently obtain incorrect results. In this paper we present the LHAPDF 6 library, a ground-up re-engineering of the PDFLIB/LHAPDF paradigm for PDF access which removes all limits on use of concurrent PDF sets, massively reduces static memory requirements, offers improved CPU performance, and fixes fundamental bugs in multi-set access to PDF metadata. The new design, restricted for now to interpolated PDFs, uses centralised numerical routines and a powerful cascading metadata system to decouple software releases from provision of new PDF data and allow completely general parton content. More than 200 PDF sets have been migrated from LHAPDF 5 to the new universal data format, via a stringent quality control procedure. LHAPDF 6 is supported by many Monte Carlo generators and other physics programs, in some cases via a full set of compatibility routines, and is recommended for the demanding PDF access needs of LHC Run 2 and beyond.
Highlights
Parton density functions (PDFs) are a crucial input into cross-section calculations at hadron colliders; they encode the process-independent momentum structure of partons within hadrons, with which partonic cross-sections must be convolved to obtain physical results that can be compared to experimental data.
Many of the problems of LHAPDF 5 stem from the combination of the static nature of Fortran memory handling and the way that evolving user demands forced features such as grid interpolation and multi-set mode to be retro-fitted onto a system not originally designed to incorporate them. These design issues combined with more logistical ones, such as the lack of any versioned connection between the PDF data files and the library, the menagerie of interpolation grid formats, and the need to modify the library itself to use a new PDF, to make LHAPDF 5 difficult both to use and to maintain. These issues became critical during Run 1 of the LHC, leading to the development of LHAPDF 6 to deal with the increased demands on parton density usage in Run 2 and beyond.
Since the Python scripting language has become widely used in high-energy physics, we provide a Python interface to the C++ LHAPDF library, which can be useful for interactive PDF testing and exploration.
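As an illustration only, a minimal interactive session with the Python interface might look like the sketch below; the set name CT10nlo and the kinematic points are arbitrary examples, and the set is assumed to be installed locally.

import lhapdf

# Load member 0 of a named PDF set (CT10nlo is used here purely as an example).
pdf = lhapdf.mkPDF("CT10nlo", 0)

# Evaluate x*f(x,Q) for the gluon (PDG ID 21) at x = 1e-3 and Q = 100 GeV.
xg = pdf.xfxQ(21, 1e-3, 100.0)

# The strong coupling associated with the set's metadata is also accessible.
a_s = pdf.alphasQ(91.2)

print(xg, a_s)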
Summary
Parton density functions (PDFs) are a crucial input into cross-section calculations at hadron colliders; they encode the process-independent momentum structure of partons within hadrons, with which partonic cross-sections must be convolved to obtain physical results that can be compared to experimental data. The first version of LHAPDF was developed to solve scaling problems with the previously standard PDFLIB library [1], and to retain backward compatibility with it; in this paper we describe a similar evolution within the LHAPDF package, from a Fortran-based static memory paradigm to a C++ one in which dynamic PDF object creation, concurrent usage, and removal of artificial limitations are fundamental. This new version addresses the most serious limitations of the Fortran version, permitting a new level of complexity in PDF systematics estimation for precision physics studies at the LHC [2] in Run 2 and beyond.
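As a reminder of the role the PDFs play, and not as a statement of any LHAPDF-specific convention, the convolution referred to above is the standard collinear factorisation formula, written schematically as

\sigma_{pp \to X} = \sum_{a,b} \int_0^1 \mathrm{d}x_1 \, \mathrm{d}x_2 \; f_a(x_1, \mu_F^2) \, f_b(x_2, \mu_F^2) \; \hat{\sigma}_{ab \to X}(x_1, x_2, \mu_F^2, \mu_R^2),

where the f_a(x, \mu_F^2) are the parton densities evaluated by LHAPDF and \hat{\sigma}_{ab \to X} is the process-dependent partonic cross-section.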