
ATLAS Software and Computing HL-LHC Roadmap


Bibliographic Details
Main Authors: CERN. Geneva. The LHC experiments Committee, LHCC
Published: 2022
Subjects: Detectors and Experimental Techniques
Online Access: http://cds.cern.ch/record/2802918
_version_ 1780972764051537920
author CERN. Geneva. The LHC experiments Committee
LHCC
author_facet CERN. Geneva. The LHC experiments Committee
LHCC
author_sort CERN. Geneva. The LHC experiments Committee
collection CERN
description The High Luminosity phase of the Large Hadron Collider (HL-LHC) is due to start operations in the late 2020s. With the upgraded LHC, the experiments will record data at a rate some 7-10 times greater than today, with the average number of collisions per bunch crossing rising to as much as 200, from 30-60 currently. By the end of Run 5, the ATLAS and CMS experiments are expected to have collected ten times more data than they will have recorded during the first three LHC runs. This dataset will unveil a new landscape for particle physicists to explore and offer unparalleled opportunities for understanding the forces of nature at their most basic level. In particular, it will contain prodigious quantities of the heaviest and least well-understood fundamental particles, Higgs bosons and top quarks, allowing the properties of these entities, and their interactions, to be measured with unprecedented precision. It will also allow searches for New Physics, including supersymmetric particles and those predicted by exotic models, to explore regions of phase space that were out of the reach of the lower-statistics LHC dataset. Exploring this new landscape will require a simulated data sample that has higher statistics than the data and is capable of modelling Standard Model backgrounds with levels of precision that have not previously been needed, implying the need for generating events at higher orders of perturbative strong and electroweak coupling. It must be able to adequately populate phase spaces such as those occupied by high-jet-multiplicity events and those probed by New Physics searches. On top of the requirements of processing and storing the data, this will pose a formidable challenge to those responsible for providing the computing resources and writing the software which underpins the research programmes of the experiments.
The demands of HL-LHC on the computing and software can be summarised thus: the events (both real and simulated) will be more complex than in the LHC era due to higher activity (in particular track multiplicity), there will be an order of magnitude more of them, and the simulated data must be produced with higher physics fidelity than previously. All of this should be achieved with (at best) a flat annual computing budget. Beyond this, the question of straightforward and efficient data access for individual physicists must also be addressed. At the same time, the computing environment in which high energy physics research must operate is changing rapidly. The cost of disk storage has ceased to fall in recent years. Although tape is significantly cheaper, the relatively small number of vendors reduces the downward pressure on prices through competition. Solid-state technology is still too expensive to be used for bulk data storage, and current trends indicate that this will remain the case into the HL-LHC era. Consequently, it is likely that, to address the storage requirements of HL-LHC, the experiments will need to “do more with less,” that is, ensure that disk resources are mainly used for active tasks, relying more on tape for long-term storage. Physics analysts will need to adapt to using more compact data formats. ATLAS produced a Conceptual Design Report (CDR) for HL-LHC Computing during the spring of 2020 for an initial review by the LHCC. The CDR laid out the issues discussed above, and the general approaches that will be taken to address them. This new document serves as a software-focused update to the first CDR, providing more concrete information on development work that will be undertaken in the coming years, listing specific milestones and target dates. Additionally, the document describes how ATLAS collaborates with external activities and projects, and how such collaboration will impact the overall development for HL-LHC.
id cern-2802918
institution European Organization for Nuclear Research
publishDate 2022
record_format invenio
spelling cern-2802918 2023-08-23T15:21:48Z http://cds.cern.ch/record/2802918
CERN. Geneva. The LHC experiments Committee
LHCC
ATLAS Software and Computing HL-LHC Roadmap
Detectors and Experimental Techniques
CERN-LHCC-2022-005
LHCC-G-182
oai:cds.cern.ch:2802918
2022-03-03
spellingShingle Detectors and Experimental Techniques
CERN. Geneva. The LHC experiments Committee
LHCC
ATLAS Software and Computing HL-LHC Roadmap
title ATLAS Software and Computing HL-LHC Roadmap
title_full ATLAS Software and Computing HL-LHC Roadmap
title_fullStr ATLAS Software and Computing HL-LHC Roadmap
title_full_unstemmed ATLAS Software and Computing HL-LHC Roadmap
title_short ATLAS Software and Computing HL-LHC Roadmap
title_sort atlas software and computing hl-lhc roadmap
topic Detectors and Experimental Techniques
url http://cds.cern.ch/record/2802918
work_keys_str_mv AT cerngenevathelhcexperimentscommittee atlassoftwareandcomputinghllhcroadmap
AT lhcc atlassoftwareandcomputinghllhcroadmap