Abstract
Building, testing and deploying coherent large software stacks is very challenging, in particular when they consist of the diverse set of packages required by the LHC experiments, the CERN Beams Department and data analysis services such as SWAN. These software stacks include a wide variety of packages (Grid middleware, Monte Carlo generators, Machine Learning tools, Python modules), all available for a large number of compilers, operating systems and hardware architectures. To address this challenge, we developed an infrastructure around a tool called lcgcmake. Dedicated modules are responsible for building the packages and controlling their dependencies in a reliable and scalable way. The distribution relies on a robust and automatic system responsible for building and testing the packages, installing them on CernVM-FS and packaging the binaries in RPMs and tarballs. This system is orchestrated through Jenkins on build machines provided by the CERN OpenStack facility. The results are published through user-friendly web pages. In this paper we present an overview of these infrastructure tools and policies. We also discuss the role of this effort within the HEP Software Foundation (HSF). Finally, we discuss the evolution of the infrastructure towards container (Docker) technologies and the future directions and challenges of the project.
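lcgcmake itself is implemented as a set of CMake modules, but the core idea behind reliable dependency control is independent of the tool: the packages form a directed acyclic graph, and a valid build order is a topological ordering of that graph. The following minimal Python sketch illustrates this idea; the package names and the dependency graph are hypothetical examples, not the actual LCG configuration or the lcgcmake API.

```python
# Illustrative sketch only: lcgcmake is CMake-based, and this is not its API.
# It shows the generic technique of deriving a build order from a package
# dependency graph via topological sorting (Python 3.9+ standard library).
from graphlib import TopologicalSorter

# Hypothetical packages, each mapped to the set of packages it depends on.
dependencies = {
    "Python": set(),
    "GSL": set(),
    "CLHEP": set(),
    "ROOT": {"Python", "GSL"},
    "Geant4": {"CLHEP"},
    "experiment-stack": {"ROOT", "Geant4"},
}

# static_order() yields every dependency before its dependents; a cyclic
# graph raises CycleError, which flags a misconfigured stack early.
build_order = list(TopologicalSorter(dependencies).static_order())
print(" -> ".join(build_order))
```

In a real build system the same ordering also determines which packages can be built in parallel: any set of packages whose dependencies are already installed can be dispatched concurrently, which is what makes this approach scale to large stacks.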
Highlights
Modern physics experiments require complex software stacks to build the experiment-specific applications
In this paper we present and discuss the way the CERN EP-SFT [1] group has addressed this task for the needs of ATLAS [2], LHCb [3], SWAN [4], FCC [5] and the CERN Beams Department [6]
In this paper we have described the main aspects of the system developed and used by CERN EP-SFT to provide software stacks to a diverse community of physicists
Summary
Modern physics experiments require complex software stacks to build the experiment-specific applications. Stability and reproducibility usually imply a conservative approach in the choice of the main components, while the constant need to integrate new developments and versions to improve performance and usability makes the provision of coherent large software stacks a dynamic and challenging task. In this paper we present and discuss the way the CERN EP-SFT [1] group has addressed this task for the needs of ATLAS [2], LHCb [3], SWAN [4], FCC [5] and the CERN Beams Department [6]. In the rest of this section we introduce some key concepts and the related terminology.