
CMS Computing Resources: Meeting the demands of the high-luminosity LHC physics program

The high-luminosity program has seen numerous extrapolations of its computing resource needs, each indicating that substantial changes are required if the desired HL-LHC physics program is to be supported within the current level of computing budgets. Drivers include large increases in event complexity (leading to increased processing time and analysis data size) and in the trigger rates needed for the HL-LHC program (5- to 10-fold increases). The CMS experiment has recently undertaken an effort to merge the ideas behind its short-term and long-term resource models in order to make extrapolations to future needs easier and more reliable. Near-term computing resource estimates depend on numerous parameters: LHC uptime and beam intensities; detector and online trigger performance; software performance; analysis data requirements; data access, management, and retention policies; site characteristics; and network performance. Longer-term modeling is affected by the same characteristics, but with much larger uncertainties that must be considered to identify the most promising handles for increasing the "physics per computing dollar" of the HL-LHC. In this presentation, we discuss the current status of long-term modeling of CMS computing resource needs for the HL-LHC, with emphasis on extrapolation techniques, uncertainty quantification, and model results. We illustrate potential ways that high-luminosity CMS could accomplish its desired physics program within today's computing budgets.
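
The modeling technique the abstract describes, a parametric estimate of resource needs whose inputs each carry uncertainties that must be propagated into the projection, can be made concrete with a minimal sketch. The Python below is illustrative only and is not the CMS model: the formula, the parameter names (trigger_rate_hz, live_seconds, reco_time_hs06_s), and all nominal values and uncertainties are invented placeholders, and Monte Carlo sampling is just one simple way to turn input uncertainties into a band on the projected CPU need.

    import random

    def sample(nominal, rel_unc):
        """Draw one value from a Gaussian around the nominal estimate."""
        return random.gauss(nominal, nominal * rel_unc)

    def cpu_need_khs06_years(trigger_rate_hz, live_seconds, reco_time_hs06_s):
        """CPU needed to reconstruct one year of data, in kHS06-years:
        events/year * HS06-seconds/event, expressed as kHS06 sustained
        over one calendar year."""
        seconds_per_year = 3.15e7
        hs06_seconds = trigger_rate_hz * live_seconds * reco_time_hs06_s
        return hs06_seconds / seconds_per_year / 1e3

    # Placeholder nominals and relative uncertainties (assumed, not CMS inputs).
    params = {
        "trigger_rate_hz":  (7500.0, 0.20),  # HLT output rate
        "live_seconds":     (7.0e6,  0.15),  # LHC live time per year
        "reco_time_hs06_s": (300.0,  0.30),  # per-event reconstruction cost
    }

    # Propagate the input uncertainties to the projection by Monte Carlo.
    trials = sorted(
        cpu_need_khs06_years(*(sample(nom, unc) for nom, unc in params.values()))
        for _ in range(100_000)
    )
    lo, mid, hi = (trials[int(f * len(trials))] for f in (0.16, 0.50, 0.84))
    print(f"Projected CPU need: {mid:.0f} kHS06-years (68% band {lo:.0f}-{hi:.0f})")

A real model would add many more terms (simulation, analysis, disk and tape) and correlations between inputs, but the structure is the same: nominal parameters, per-parameter uncertainties, and a propagation step that yields a central projection with a band rather than a single number.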


Bibliographic Details
Main Authors: Lange, David; Bloom, Kenneth; Boccali, Tommaso; Gutsche, Oliver; Vaandering, Eric
Language: eng
Published: 2019
Subjects: Computing and Computers
Online access: https://dx.doi.org/10.1051/epjconf/201921403055
http://cds.cern.ch/record/2701504
_version_ 1780964600137646080
author Lange, David
Bloom, Kenneth
Boccali, Tommaso
Gutsche, Oliver
Vaandering, Eric
author_facet Lange, David
Bloom, Kenneth
Boccali, Tommaso
Gutsche, Oliver
Vaandering, Eric
author_sort Lange, David
collection CERN
description The high-luminosity program has seen numerous extrapolations of its computing resource needs, each indicating that substantial changes are required if the desired HL-LHC physics program is to be supported within the current level of computing budgets. Drivers include large increases in event complexity (leading to increased processing time and analysis data size) and in the trigger rates needed for the HL-LHC program (5- to 10-fold increases). The CMS experiment has recently undertaken an effort to merge the ideas behind its short-term and long-term resource models in order to make extrapolations to future needs easier and more reliable. Near-term computing resource estimates depend on numerous parameters: LHC uptime and beam intensities; detector and online trigger performance; software performance; analysis data requirements; data access, management, and retention policies; site characteristics; and network performance. Longer-term modeling is affected by the same characteristics, but with much larger uncertainties that must be considered to identify the most promising handles for increasing the "physics per computing dollar" of the HL-LHC. In this presentation, we discuss the current status of long-term modeling of CMS computing resource needs for the HL-LHC, with emphasis on extrapolation techniques, uncertainty quantification, and model results. We illustrate potential ways that high-luminosity CMS could accomplish its desired physics program within today's computing budgets.
id oai-inspirehep.net-1760952
institution European Organization for Nuclear Research (CERN)
language eng
publishDate 2019
record_format invenio
spelling oai-inspirehep.net-1760952 (2022-08-10T12:23:37Z); doi:10.1051/epjconf/201921403055; http://cds.cern.ch/record/2701504; FERMILAB-CONF-19-552-SCD; oai:inspirehep.net:1760952; 2019
spellingShingle Computing and Computers
Lange, David
Bloom, Kenneth
Boccali, Tommaso
Gutsche, Oliver
Vaandering, Eric
CMS Computing Resources: Meeting the demands of the high-luminosity LHC physics program
title CMS Computing Resources: Meeting the demands of the high-luminosity LHC physics program
title_full CMS Computing Resources: Meeting the demands of the high-luminosity LHC physics program
title_fullStr CMS Computing Resources: Meeting the demands of the high-luminosity LHC physics program
title_full_unstemmed CMS Computing Resources: Meeting the demands of the high-luminosity LHC physics program
title_short CMS Computing Resources: Meeting the demands of the high-luminosity LHC physics program
title_sort cms computing resources: meeting the demands of the high-luminosity lhc physics program
topic Computing and Computers
url https://dx.doi.org/10.1051/epjconf/201921403055
http://cds.cern.ch/record/2701504
work_keys_str_mv AT langedavid cmscomputingresourcesmeetingthedemandsofthehighluminositylhcphysicsprogram
AT bloomkenneth cmscomputingresourcesmeetingthedemandsofthehighluminositylhcphysicsprogram
AT boccalitommaso cmscomputingresourcesmeetingthedemandsofthehighluminositylhcphysicsprogram
AT gutscheoliver cmscomputingresourcesmeetingthedemandsofthehighluminositylhcphysicsprogram
AT vaanderingeric cmscomputingresourcesmeetingthedemandsofthehighluminositylhcphysicsprogram