Abstract
Although the LHC experiments have been designed and prepared since 1984, the challenge of LHC computing was only tackled seriously much later, at the end of the ’90s. This was the time at which the Grid paradigm was emerging, and LHC computing had great hopes that most of its challenges would be solved by this new paradigm. The path to having functional and efficient distributed computing systems was in the end much more complex than anticipated. However, most obstacles were overcome, thanks to the introduction of new paradigms and a large manpower investment from the experiments and from the supporting IT units (for middleware development and infrastructure setup). This contribution briefly outlines some of the biggest hopes and disillusions of these past 20 years, and gives a brief outlook on coming trends.
Highlights
The LHC computing challenges only started to be considered seriously long after the detectors had been designed and their construction initiated
A review of the computing needs of the LHC experiments, conducted at the end of the ’90s, concluded that the computing resources required to handle such large datasets were far beyond what could be provided at a single location and funded by a single agency [1]
The LCG (LHC Computing Grid) is much more a coordinating body: middleware was to be developed in external projects (the EU DataGrid to start with, soon followed by EGEE (Enabling Grids for E-sciencE) and EMI (European Middleware Initiative)); deployment is ensured by the sites themselves, but clearly had to be coordinated so that all sites cooperate efficiently; the fabric area was meant to cover in particular the CERN fabric, given its central role as the place where data are produced; and the applications area coordinated the experiments' efforts as well as common software developments such as ROOT and other packages used by several of the experiments
Summary
The LHC computing challenges only started to be considered seriously long after the detectors had been designed and their construction initiated. Fortran, which had been the language widely used in HEP experiments until then, was no longer suitable for handling the complexity of these new experiments, and the turn had been taken in the mid-’90s to move to object-oriented programming. This was not an easy move, but despite some resistance, the LHC experiments converged on programming mostly in C++. The processing and analysis of the petabytes of data produced every year by the LHC experiments nevertheless remained the biggest challenge.