Analysis Facilities for the HL-LHC White Paper

  • Abstract
  • Literature Map
  • References
  • Similar Papers
Abstract

This white paper presents the current status of R&D for Analysis Facilities (AFs) and attempts to summarize the views on the future direction of these facilities. These views have been collected through the High Energy Physics (HEP) Software Foundation's (HSF) Analysis Facilities Forum, established in March 2022; the Analysis Ecosystems II workshop, which took place in May 2022; and the WLCG/HSF pre-CHEP workshop, which took place in May 2023. The paper attempts to cover all aspects of an analysis facility.

References (showing 10 of 17 papers)
  • Andreas Haupt + 1 more, "The NAF: National analysis facility at DESY", Journal of Physics: Conference Series, Apr 1, 2010. doi:10.1088/1742-6596/219/5/052007 (13 citations, open access)

  • Tommaso Tedeschi + 7 more, "Prototyping a ROOT-based distributed analysis workflow for HL-LHC: The CMS use case", Computer Physics Communications, Oct 16, 2023. doi:10.1016/j.cpc.2023.108965 (open access)

  • Martin Barisits + 29 more, "Rucio: Scientific Data Management", Computing and Software for Big Science, Aug 9, 2019. doi:10.1007/s41781-019-0026-3 (118 citations, open access)

  • Michael R Crusoe + 10 more, "Methods included", Communications of the ACM, May 20, 2022. doi:10.1145/3486897 (109 citations, open access)

  • Danilo Piparo + 5 more, "SWAN: A service for interactive analysis in the cloud", Future Generation Computer Systems, Dec 6, 2016. doi:10.1016/j.future.2016.11.035 (41 citations, open access)

  • Tibor Šimko + 4 more, "REANA: A System for Reusable Research Data Analyses", EPJ Web of Conferences, Jan 1, 2019. doi:10.1051/epjconf/201921406034 (32 citations, open access)

  • Jakob Blomer + 9 more, "Evolution of the ROOT Tree I/O", EPJ Web of Conferences, Jan 1, 2020. doi:10.1051/epjconf/202024502030 (19 citations, open access)

  • Emanuele Simili + 5 more, "ARMing HEP for the future Energy Efficiency of WLCG sites (ARM vs. x86)", EPJ Web of Conferences, Jan 1, 2024. doi:10.1051/epjconf/202429511007 (1 citation, open access)

  • Felix Mölder + 14 more, "Sustainable data analysis with Snakemake", F1000Research, Apr 19, 2021. doi:10.12688/f1000research.29032.2 (1183 citations, open access)

Similar Papers
  • J Apostolakis, "HEP Software Foundation Community White Paper Working Group - Detector Simulation", report, Mar 12, 2018. doi:10.2172/1437300 (7 citations)

A working group on detector simulation was formed as part of the high-energy physics (HEP) Software Foundation's initiative to prepare a Community White Paper that describes the main software challenges and opportunities to be faced in the HEP field over the next decade. The working group met over a period of several months in order to review the current status of the Full and Fast simulation applications of HEP experiments and the improvements that will need to be made in order to meet the goals of future HEP experimental programmes. The scope of the topics covered includes the main components of a HEP simulation application, such as MC truth handling, geometry modeling, particle propagation in materials and fields, physics modeling of the interactions of particles with matter, the treatment of pileup and other backgrounds, as well as signal processing and digitisation. The resulting work programme described in this document focuses on the need to improve both the software performance and the physics of detector simulation. The goals are to increase the accuracy of the physics models and expand their applicability to future physics programmes, while achieving large factors in computing performance gains consistent with projections on available computing resources.

  • Department of Energy, Washington, D.C. (USA), Division of High Energy and Nuclear Physics, "Summaries of FY 1977 research in high energy physics", report, Oct 1, 1977. doi:10.2172/5337442

The U.S. Department of Energy, through the Office of Energy Research and the Division of High Energy and Nuclear Physics, provides approximately 90% of the total federal support for high energy physics research effort in the United States. The High Energy Physics Program primarily utilizes four major U.S. high energy accelerator facilities and over 50 universities under contract to do experimental and theoretical investigations on the properties, structure and transformation of matter and energy in their most basic forms. This compilation of research summaries is intended to present a convenient report of the scope and nature of high energy physics research presently funded by the U.S. Department of Energy. The areas covered include conception, design, construction, and operation of particle accelerators; experimental research using the accelerators and ancillary equipment; theoretical research; and research and development programs to advance accelerator technology, particle detector systems, and data analysis capabilities. Major concepts and experimental facts in high energy physics have recently been discovered which have the promise of unifying the fundamental forces and of understanding the basic nature of matter and energy. The summaries contained in this document were reproduced in essentially the form submitted by contractors as of January 1977.

  • D M South, "Data Preservation and Long Term Analysis in High Energy Physics", Journal of Physics: Conference Series, Dec 13, 2012. doi:10.1088/1742-6596/396/6/062018

Several important and unique experimental high-energy physics programmes at a variety of facilities are coming to an end, including those at HERA, the B-factories and the Tevatron. The wealth of physics data from these experiments is the result of a significant financial and human effort, and yet until recently no coherent strategy existed for data preservation and re-use. To address this issue, an inter-experimental study group on data preservation and long-term analysis in high-energy physics was convened at the end of 2008, publishing an interim report in 2009. The membership of the study group has since expanded, including the addition of the LHC experiments, and a full status report has now been released. This report greatly expands on the ideas contained in the original publication and provides a more solid set of recommendations, not only concerning data preservation and its implementation in high-energy physics, but also the future direction and organisational model of the study group. The main messages of the status report were presented for the first time at the 2012 International Conference on Computing in High Energy and Nuclear Physics and are summarised in these proceedings.

  • Jose Luis Gordillo, "Supercomputing and High Energy Physics in UNAM", conference article, Jul 1, 2012. doi:10.1109/ispa.2012.86

The purpose of this proposal is to describe the high-end computing, storage, networking and grid infrastructures at UNAM, and how high energy and particle physics research groups have promoted and used them. The National University of Mexico (UNAM) has a long tradition in High Performance Computing. Since 1991, both centralized and group efforts have been conducted to provide high performance computers for several areas of research. For the last five years, the main supercomputer at UNAM has been KanBalam, a 342-node cluster with 130 TBytes of storage and a performance of 7.13 TFlop/s. In a few months, UNAM will update its general-purpose supercomputing infrastructure with a new 100 TFlop/s cluster. Staff for centralized supercomputing services have maintained close collaborations with the Mexican high energy and particle physics communities, including the HAWC, Auger and ALICE groups. Several projects and activities have resulted from these collaborations, including the Latin American grid projects EELA-1, EELA-2 and GISELA. Since last year, UNAM has been deploying infrastructure and human resources to become a T1 node of the ALICE-Grid.

  • Jean‐Luc Vay + 6 more, "White Paper on DOE-HEP Accelerator Modeling Science Activities", preprint, Jun 27, 2017. doi:10.6084/m9.figshare.793816.v3 (1 citation)

Authors: Jean-Luc Vay, Cameron G. R. Geddes, Alice Koniges, Alex Friedman, David P. Grote, David L. Bruhwiler, John P. Verboncoeur. Toward the goal of maximizing the impact of computer modeling on the design of future particle accelerators and the development of new accelerator techniques and technologies, this white paper presents the rationale for (a) strengthening and expanding programmatic activities in accelerator modeling science within the Department of Energy (DOE) Office of High Energy Physics (HEP) and (b) increasing the community-wide coordination and integration of code development.

  • Michel H Villanueva + 2 more, "Software Training in High Energy Physics", Journal of Physics: Conference Series, Feb 1, 2023. doi:10.1088/1742-6596/2438/1/012063 (1 citation)

Among the upgrades to current high energy physics (HEP) experiments and the new facilities coming online, solving software challenges has become integral to the success of the collaborations. The demand for people skilled in both HEP and software domains is increasing. With a highly distributed workforce, the sustainability of the HEP ecosystem requires a continuous effort to equip physicists with the required abilities in software development. In this paper, the collective software training program in HEP and its activities, led by the HEP Software Foundation (HSF) and the Institute for Research and Innovation in Software in HEP (IRIS-HEP), are presented. Experiment-agnostic, open, and accessible training modules have been developed, focusing on common software material ranging from core software skills needed by everyone to advanced training required to produce high-quality sustainable software. A basic software curriculum was built, and an introductory software training event has been prepared to serve HEP newcomers. This program provides individuals with transferable skills that are becoming increasingly important to careers in the realm of software and computing, whether inside or outside HEP.

  • Lothar Bauerdick, "HEP Software Foundation Community White Paper Working Group - Data Analysis and Interpretation", report, Apr 9, 2018. doi:10.2172/1436702 (1 citation)

At the heart of experimental high energy physics (HEP) is the development of facilities and instrumentation that provide sensitivity to new phenomena. Our understanding of nature at its most fundamental level is advanced through the analysis and interpretation of data from sophisticated detectors in HEP experiments. The goal of data analysis systems is to realize the maximum possible scientific potential of the data within the constraints of computing and human resources in the least time. To achieve this goal, future analysis systems should empower physicists to access the data with a high level of interactivity, reproducibility and throughput capability. As part of the HEP Software Foundation Community White Paper process, a working group on Data Analysis and Interpretation was formed to assess the challenges and opportunities in HEP data analysis and develop a roadmap for activities in this area over the next decade. In this report, the key findings and recommendations of the Data Analysis and Interpretation Working Group are presented.

  • Liangliang Shang + 1 more, "EasyScan_HEP: A tool for connecting programs to scan the parameter space of physics models", Computer Physics Communications, Nov 29, 2023. doi:10.1016/j.cpc.2023.109027 (7 citations)


  • Sudhir Malik + 46 more, "Software Training in HEP", Computing and Software for Big Science, Jan 1, 2021. doi:10.1007/s41781-021-00069-9 (4 citations)

The long-term sustainability of the high-energy physics (HEP) research software ecosystem is essential to the field. With new facilities and upgrades coming online throughout the 2020s, this will only become increasingly important. Meeting the sustainability challenge requires a workforce with a combination of HEP domain knowledge and advanced software skills. The required software skills fall into three broad groups. The first is fundamental and generic software engineering (e.g., Unix, version control, C++, and continuous integration). The second is knowledge of domain-specific HEP packages and practices (e.g., the ROOT data format and analysis framework). The third is more advanced knowledge involving specialized techniques, including parallel programming, machine learning and data science tools, and techniques to maintain software projects at all scales. This paper discusses the collective software training program in HEP led by the HEP Software Foundation (HSF) and the Institute for Research and Innovation in Software in HEP (IRIS-HEP). The program equips participants with an array of software skills that serve as ingredients for the solution of HEP computing challenges. Beyond serving the community by ensuring that members are able to pursue research goals, the program serves individuals by providing intellectual capital and transferable skills important to careers in the realm of software and computing, inside or outside HEP.

  • Qiulan Huang + 5 more, "Using Hadoop for High Energy Physics Data Analysis", book chapter, Jan 1, 2019. doi:10.1007/978-3-030-28061-1_16

With the development of the new generation of High Energy Physics (HEP) experiments, huge amounts of data are being generated. Efficient parallel algorithms/frameworks and high I/O throughput are key to meeting the scalability and performance requirements of HEP offline data analysis. Though Hadoop has gained a lot of attention from the scientific community for its scalability and parallel computing framework for large data sets, it is still difficult to make HEP data-processing tasks run directly on Hadoop. In this paper we investigate how to make HEP jobs run on Hadoop transparently. In particular, we discuss a new mechanism that lets HEP software randomly access data in HDFS: because HDFS is a streaming data store supporting only sequential write and append, it cannot satisfy HEP jobs that need random access. The new feature allows Map/Reduce tasks to perform random reads and writes on the local file system of the data nodes instead of using the Hadoop data-streaming interface, making it possible to run HEP jobs on Hadoop. We also developed diverse MapReduce models for HEP jobs such as Corsika simulation, ARGO detector simulation and Medea++ reconstruction, along with a toolkit for users to submit, query and remove jobs. In addition, we provide cluster monitoring and an accounting system to improve system availability. This work has been in production for HEP experiments since September 2016, delivering about 40,000 CPU hours per month.
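The staging pattern that abstract describes can be sketched in a few lines. This is a hypothetical illustration, not code from the paper: a Hadoop Streaming-style map task copies its input split onto the node-local filesystem so that a legacy executable can perform seek()-style random access, which HDFS's sequential write/append model does not allow. The names `run_legacy_job` and `mapper` are illustrative.

```python
# Hypothetical sketch: stage an HDFS split to local disk so a legacy HEP
# binary can do random-access reads, then clean up the scratch space.
import os
import shutil
import tempfile

def run_legacy_job(local_path: str) -> str:
    """Stand-in for a HEP executable (e.g. a reconstruction step) that
    needs seek()-style random access on its input file."""
    with open(local_path, "rb") as f:
        f.seek(0, os.SEEK_END)   # random access works on the local copy
        size = f.tell()
    return f"processed {os.path.basename(local_path)} ({size} bytes)"

def mapper(staged_path: str) -> str:
    """Map task: copy the split to node-local disk, run the random-access
    job, then clean up. A real task would write output back to HDFS."""
    workdir = tempfile.mkdtemp(prefix="hep-map-")
    local_copy = os.path.join(workdir, os.path.basename(staged_path))
    shutil.copy(staged_path, local_copy)   # bypass the streaming interface
    try:
        return run_legacy_job(local_copy)
    finally:
        shutil.rmtree(workdir)             # free node-local scratch space
```

In a real deployment the framework would feed each task the path of its staged split; the key design point is simply that random access happens on the data node's local filesystem rather than through the HDFS streaming API.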

  • Norisuke Sakai, "Yoshio Yamaguchi", Physics Today, Jul 1, 2017. doi:10.1063/pt.3.3636

Yoshio Yamaguchi, an icon of postwar Japanese high-energy physics and a strong advocate of international collaboration, died of pneumonia on 12 August 2016 in Tokyo. (Photo: Yoshio Yamaguchi, courtesy of Akiko Yamaguchi, 2010.)

Born on 29 January 1926 in Takefu, Japan, Yamaguchi received his bachelor's degree in physics in 1947 and a doctor of science degree in physics in 1953 from the University of Tokyo. His dissertation was entitled "Phenomenological analysis of meson processes." In 1949 Yamaguchi joined Osaka City University as a cofounder of its particle-physics theory group. His proposal that the newly discovered particles in cosmic-ray collisions are created in pairs led to the eventual understanding of strange particles. Yamaguchi started his international career in 1953 at the University of Illinois at Urbana-Champaign; during his two years there, he proposed a separable nuclear potential.

In 1957 Yamaguchi was invited to the theory division at CERN. At the time, CERN was in its infancy, and the newly created theory division was attracting numerous superb physicists, with whom Yamaguchi had deep and fruitful interactions and developed close, lasting friendships. His experience during that period greatly influenced Yamaguchi's passion for international collaboration throughout the rest of his life.

Yamaguchi played a vital role in evaluating and consulting on experimental programs at CERN. According to local legend, anyone seeking to get a new research experiment approved first had to convince him to support it. His four years at the organization convinced him that experimental study and verification by means of accelerators were crucial to advancing high-energy physics. While at CERN, he also laid the groundwork for a theory on SU(3) and other flavor symmetries in particle physics.

After leaving CERN, Yamaguchi went to the Institute for Nuclear Study (INS) at the University of Tokyo as head of the theory group. In 1968 he moved to the university's physics department, where he taught and supervised many students. He was not only an excellent researcher but also a dedicated educator. His enthusiasm for and deep knowledge of high-energy physics inspired many young scientists. As one of his students, I witnessed lively discussions every week during and after his High Energy Physics lectures, which attracted many students and staff members.

Yamaguchi strongly believed that significant progress can be achieved only if experimental and theoretical physics researchers work hand in hand. He made great efforts to establish experimental high-energy physics by introducing high-energy proton accelerators in Japan. He was instrumental in the creation of the National Laboratory for High Energy Physics and of KEK, the High Energy Accelerator Research Organization.

An excellent manager, Yamaguchi returned to the INS as director in 1983 until his retirement in 1986. He was a cofounder and chair of the International Committee for Future Accelerators, which promoted a worldwide network of collaborations. As a member of the Scientific Policy Committee at CERN, he was responsible for, among other things, further promoting international collaborations. He served as president of the Physical Society of Japan in 1986–87. In 1993 he became the first person from Japan to be elected president of the International Union of Pure and Applied Physics. Yamaguchi contributed a great deal to the creation of the Association of Asia Pacific Physical Societies and the Asia Pacific Center for Theoretical Physics.

Yamaguchi had a tasteful knowledge of the culture of Japan and the world, especially of the classical period. He often impressed colleagues with his memory of his many intellectual conversations with them. His talks were full of wit and humor, which was somewhat exceptional for a Japanese person. Some years ago his friends and students had a chance to listen to his stories about physics and physicists from the early days of postwar Japan; the stories were so vivid that they left a lasting impression. Many of us pray sincerely that Yamaguchi's hope and enthusiasm for international collaboration and high-energy physics will inspire the coming generation of researchers in Japan and around the world. © 2017 American Institute of Physics.

  • Markus Ackermann + 5 more, "High-energy and ultra-high-energy neutrinos: A Snowmass white paper", Journal of High Energy Astrophysics, Aug 8, 2022. doi:10.1016/j.jheap.2022.08.001 (63 citations)

Astrophysical neutrinos are excellent probes of astroparticle physics and high-energy physics. With energies far beyond solar, supernovae, atmospheric, and accelerator neutrinos, high-energy and ultra-high-energy neutrinos probe fundamental physics from the TeV scale to the EeV scale and beyond. They are sensitive to physics both within and beyond the Standard Model through their production mechanisms and in their propagation over cosmological distances. They carry unique information about their extreme non-thermal sources by giving insight into regions that are opaque to electromagnetic radiation. This white paper describes the opportunities astrophysical neutrino observations offer for astrophysics and high-energy physics, today and in coming years.

  • G Benelli + 22 more, "Data Science and Machine Learning in Education", report, Jul 19, 2022. doi:10.2172/1882567 (2 citations)

The growing role of data science (DS) and machine learning (ML) in high-energy physics (HEP) is well established and pertinent given the complex detectors, large data sets and sophisticated analyses at the heart of HEP research. Moreover, exploiting symmetries inherent in physics data has inspired physics-informed ML as a vibrant sub-field of computer science research. HEP researchers benefit greatly from materials widely available for use in education, training and workforce development. They are also contributing to these materials and providing software to DS/ML-related fields. Increasingly, physics departments are offering courses at the intersection of DS, ML and physics, often using curricula developed by HEP researchers and involving open software and data used in HEP. In this white paper, we explore synergies between HEP research and DS/ML education, discuss opportunities and challenges at this intersection, and propose community activities that will be mutually beneficial.

  • Ilan Ben-Zvi + 3 more, "Energy recovery linacs in high-energy and nuclear physics", Nuclear Instruments and Methods in Physics Research Section A, Nov 28, 2005. doi:10.1016/j.nima.2005.10.057 (16 citations)


  • Harvey B Newman, "Networking for High Energy and Nuclear Physics", Computer Physics Communications, Feb 8, 2007. doi:10.1016/j.cpc.2007.02.002 (1 citation)


More from: Computing and Software for Big Science
  • Lucie Flek + 7 more, "Enforcing Fundamental Relations via Adversarial Attacks on Input Parameter Correlations", Nov 5, 2025. doi:10.1007/s41781-025-00148-1

  • Adeel Akram + 5 more, "Application of Geometric Deep Learning for Tracking of Hyperons in a Straw Tube Detector", Oct 21, 2025. doi:10.1007/s41781-025-00146-3

  • D Ciangottini + 65 more, "Analysis Facilities for the HL-LHC White Paper", Jul 13, 2025. doi:10.1007/s41781-025-00133-8

  • Bartosz Soból + 3 more, "Performance Portability of the Particle Tracking Algorithm Using SYCL", Jul 1, 2025. doi:10.1007/s41781-025-00143-6

  • Denise Hellwig + 4 more, "PhyLiNO: a forward-folding likelihood-fit framework for neutrino oscillation physics", Jul 1, 2025. doi:10.1007/s41781-025-00142-7

  • Ho Fung Tsoi + 8 more, "SymbolFit: Automatic Parametric Modeling with Symbolic Regression", Jul 1, 2025. doi:10.1007/s41781-025-00140-9

  • V Kholoimov + 4 more, "A Downstream and Vertexing Algorithm for Long Lived Particles (LLP) Selection at the First High Level Trigger (HLT1) of LHCb", Jul 1, 2025. doi:10.1007/s41781-025-00141-8

  • Gabriel Zachmann + 2 more, "oidc-agent - Integrating OpenID Connect Tokens with the Command Line", May 22, 2025. doi:10.1007/s41781-025-00137-4

  • Johannes Erdmann + 2 more, "KAN We Improve on HEP Classification Tasks? Kolmogorov–Arnold Networks Applied to an LHC Physics Example", May 22, 2025. doi:10.1007/s41781-025-00138-3

  • T Evans + 2 more, "An automated bandwidth division for the LHCb upgrade trigger", May 21, 2025. doi:10.1007/s41781-025-00139-2
