Abstract
Analyses at the LHC which search for rare physics processes or determine Standard Model parameters with high precision require accurate simulations of the detector response and of the event selection process. An accurate determination of the trigger response is crucial for determining overall selection efficiencies and signal sensitivities. For the generation and reconstruction of simulated event data, the most recent software releases are usually used to ensure the best agreement between simulated and real data. For the simulation of the trigger selection process, however, ideally the same software release that was deployed when the real data were taken should be used. This potentially requires running software dating many years back. A strategy for running old software in a modern environment thus becomes essential once data simulated for past years represent a sizable fraction of the total. We examined the requirements and possibilities for such a simulation scheme within the ATLAS software framework and successfully implemented a proof-of-concept simulation chain. One of the greatest challenges was the choice of a data format which promises long-term compatibility between old and new software releases. Over the time periods envisaged, data format incompatibilities are also likely to emerge in databases and other external support services. Software availability may become an issue when, for example, support for the underlying operating system ends. In this paper we present the problems encountered and the solutions developed, and discuss proposals for future development. Some ideas reach beyond the retrospective trigger simulation scheme in ATLAS, as they also touch more general aspects of data preservation.
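The long-term format-compatibility challenge mentioned above can be illustrated with a minimal, hypothetical sketch (the names below are illustrative, not taken from the ATLAS event data model): each persistent record carries an explicit schema version, and the reader applies registered converters instead of assuming the current layout, so that data written by an old release remain readable by a new one.

```python
# Illustrative sketch only: a version-tagged payload with explicit
# converters for older schema versions. Names are hypothetical and
# not part of any ATLAS API.
import json

CURRENT_VERSION = 2

def write_record(payload):
    """Persist a record together with its schema version."""
    return json.dumps({"version": CURRENT_VERSION, "payload": payload})

def _convert_v1_to_v2(payload):
    # Assumed example change: v1 stored transverse momentum in MeV,
    # v2 uses GeV.
    payload["pt"] = payload["pt"] / 1000.0
    return payload

# One converter per version step; chained to reach CURRENT_VERSION.
_CONVERTERS = {1: _convert_v1_to_v2}

def read_record(blob):
    """Read a record written by any supported schema version."""
    record = json.loads(blob)
    version, payload = record["version"], record["payload"]
    while version < CURRENT_VERSION:
        payload = _CONVERTERS[version](payload)
        version += 1
    return payload

# A record written by an "old" release with schema version 1:
old_blob = json.dumps({"version": 1, "payload": {"pt": 25000.0}})
print(read_record(old_blob))  # pt converted from MeV to GeV
```

Production systems such as ROOT solve this with built-in schema evolution; the sketch only shows the underlying idea of never reading persistent data without consulting its version tag.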
Highlights
Proposal: for the trigger simulation of past data-taking periods, use the same trigger software releases that were deployed during those periods
The proposed strategy should require minimal to no work on already existing releases
Hardware abstraction and preservation of the software environment introduce computational and resource overhead
Patch releases need to be foreseen to adapt to changing external infrastructure services, e.g. data input/output services or changing database technologies
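The patch-release idea in the last highlight can be sketched as follows, with entirely hypothetical names (this is not the ATLAS conditions-database API): if the frozen trigger release talks to external services only through a narrow interface, a patch release merely has to swap in a new adapter when the backend technology changes, leaving the simulation code untouched.

```python
# Illustrative sketch only (hypothetical names): isolating an external
# database service behind a narrow interface, so a patch release only
# replaces the adapter when the service is retired or changes.
from abc import ABC, abstractmethod

class ConditionsSource(ABC):
    """Narrow interface used by the frozen trigger release."""
    @abstractmethod
    def get(self, folder: str, run: int) -> dict: ...

class LegacyDbSource(ConditionsSource):
    """Adapter for the database technology used at data-taking time."""
    def __init__(self, records):
        self._records = records
    def get(self, folder, run):
        return self._records[(folder, run)]

class ArchiveFileSource(ConditionsSource):
    """Patch-release adapter: reads a flat archive exported before the
    original database service was decommissioned."""
    def __init__(self, archive):
        self._archive = archive
    def get(self, folder, run):
        return self._archive[f"{folder}#{run}"]

def trigger_thresholds(source: ConditionsSource, run: int) -> dict:
    # The simulation code is unchanged; only the adapter differs.
    return source.get("/TRIGGER/Thresholds", run)

old = LegacyDbSource({("/TRIGGER/Thresholds", 167776): {"mu_pt": 10.0}})
new = ArchiveFileSource({"/TRIGGER/Thresholds#167776": {"mu_pt": 10.0}})
assert trigger_thresholds(old, 167776) == trigger_thresholds(new, 167776)
```

The design choice being illustrated is that the frozen release never depends on a concrete service technology, only on the interface, so the overhead of a patch release is limited to one adapter class.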
Summary
Trigger configurations evolve over time: selection criteria change, e.g. due to higher luminosities, and new trigger algorithms and new trigger lines are introduced. Re-simulation of the trigger response may be necessary because improved event generators are deployed, an improved detector description or event reconstruction becomes available, the statistics of the MC samples need to be increased, or the response of new physics processes to the trigger selection needs to be studied. How can this be achieved for past data-taking periods?