New challenges for distributed computing at the CMS experiment

The Large Hadron Collider (LHC) experiments will soon step into the next period of Run-3 data-taking, with an increased data rate and high pileup requiring a well-functioning computing infrastructure. In the future High-Luminosity LHC (HL-LHC) data-taking period, the compute, storage and network facilit...

Bibliographic Details
Main Author: Krammer, Natascha
Language: eng
Published: 2020
Subjects: Detectors and Experimental Techniques
Online Access: http://cds.cern.ch/record/2780118
_version_ 1780971851152883712
author Krammer, Natascha
author_facet Krammer, Natascha
author_sort Krammer, Natascha
collection CERN
description The Large Hadron Collider (LHC) experiments will soon step into the next period of Run-3 data-taking, with an increased data rate and high pileup requiring a well-functioning computing infrastructure. In the future High-Luminosity LHC (HL-LHC) data-taking period, the compute, storage and network facilities will have to be further extended by large factors, and flexible and sophisticated computing models will be essential. New state-of-the-art methods in physics analysis and data science, such as Deep Learning and Big Data tools, are crucial for handling high-dimensional and more complex problems. Besides flexible cloud computing technologies, the use of High Performance Computing (HPC) at the LHC experiments is being explored. In this presentation, I will discuss the computing technologies for the LHC Run-3 and future HL-LHC runs, and the use of modern physics analysis and data science methods to meet the increasing and complex demands of large-scale scientific computing.
id cern-2780118
institution Organización Europea para la Investigación Nuclear
language eng
publishDate 2020
record_format invenio
spelling cern-27801182021-09-06T19:04:53Zhttp://cds.cern.ch/record/2780118engKrammer, NataschaNew challenges for distributed computing at the CMS experimentDetectors and Experimental TechniquesThe Large Hadron Collider (LHC) experiments will soon step into the next period of Run-3 data-taking, with an increased data rate and high pileup requiring a well-functioning computing infrastructure. In the future High-Luminosity LHC (HL-LHC) data-taking period, the compute, storage and network facilities will have to be further extended by large factors, and flexible and sophisticated computing models will be essential. New state-of-the-art methods in physics analysis and data science, such as Deep Learning and Big Data tools, are crucial for handling high-dimensional and more complex problems. Besides flexible cloud computing technologies, the use of High Performance Computing (HPC) at the LHC experiments is being explored. In this presentation, I will discuss the computing technologies for the LHC Run-3 and future HL-LHC runs, and the use of modern physics analysis and data science methods to meet the increasing and complex demands of large-scale scientific computing.CMS-CR-2020-101oai:cds.cern.ch:27801182020-04-27
spellingShingle Detectors and Experimental Techniques
Krammer, Natascha
New challenges for distributed computing at the CMS experiment
title New challenges for distributed computing at the CMS experiment
title_full New challenges for distributed computing at the CMS experiment
title_fullStr New challenges for distributed computing at the CMS experiment
title_full_unstemmed New challenges for distributed computing at the CMS experiment
title_short New challenges for distributed computing at the CMS experiment
title_sort new challenges for distributed computing at the cms experiment
topic Detectors and Experimental Techniques
url http://cds.cern.ch/record/2780118
work_keys_str_mv AT krammernatascha newchallengesfordistributedcomputingatthecmsexperiment