Development, Validation and Integration of the ATLAS Trigger System Software in Run 2

The trigger system of the ATLAS detector at the LHC is a combination of hardware, firmware and software, associated with various sub-detectors, that must cooperate seamlessly in order to select 1 collision of interest out of every 40,000 delivered by the LHC every millisecond. This talk discusses the challenges, workflow and organization of the ongoing trigger software development, validation and deployment. This development, from the top-level integration and configuration down to the individual components responsible for each subsystem, ensures that the most up-to-date algorithms are used to optimize the performance of the experiment. This optimization hinges on the reliability and predictability of the software performance, which is why validation is of the utmost importance. The software adheres to a hierarchical release structure, with newly validated releases propagating upwards. Integration tests are carried out on a daily basis to ensure that the releases deployed to the online trigger farm during data taking run as desired. Releases at all levels are validated by fully reconstructing the data from the raw files of a benchmark run, mimicking the reconstruction that occurs during normal data taking. This exercise is computationally demanding and therefore runs with high priority on the ATLAS high-performance computing grid. Performance metrics, ranging from low-level memory and CPU requirements to the shapes and efficiencies of high-level physics quantities, are visualized and validated by a range of experts. This multifaceted, critical task ties together many aspects of the experimental effort and directly influences the overall performance of the ATLAS experiment.
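
For orientation, the rates quoted in the abstract work out as follows: 40,000 collisions per millisecond corresponds to an input rate of 40 MHz, and 1 selected collision per millisecond to an output rate of roughly 1 kHz, i.e. a rejection factor of about 4 x 10^4. The sketch below is purely illustrative and is not ATLAS tooling; every name and number in it is a hypothetical placeholder. It shows the kind of automated check implied by the abstract, in which performance metrics from a test release are compared against a reference release and large deviations are flagged for expert review.

# Illustrative only, not ATLAS software: a minimal stand-in for the kind of
# automated comparison described above, in which a handful of performance
# metrics from a test release are checked against a reference release and
# large relative deviations are flagged for expert review.

# Hypothetical metric values; in practice they would come from fully
# reconstructing the raw data of a benchmark run on the computing grid.
reference = {
    "peak_memory_mb": 3100.0,
    "cpu_time_per_event_ms": 420.0,
    "single_electron_trigger_efficiency": 0.92,
}
test_release = {
    "peak_memory_mb": 3150.0,
    "cpu_time_per_event_ms": 455.0,
    "single_electron_trigger_efficiency": 0.91,
}

RELATIVE_TOLERANCE = 0.05  # flag any metric that moves by more than 5%


def flag_deviations(ref, new, tol):
    """Return the names of metrics whose relative change exceeds tol."""
    return [name for name, ref_value in ref.items()
            if abs(new[name] - ref_value) / ref_value > tol]


if __name__ == "__main__":
    for metric in flag_deviations(reference, test_release, RELATIVE_TOLERANCE):
        print(f"needs expert sign-off: {metric}")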

Bibliographic Details
Main Authors: Keyes, Robert; Vazquez Schroeder, Tamara; George, Simon
Language: eng
Published: 2016
Subjects: Particle Physics - Experiment
Online Access: http://cds.cern.ch/record/2221689
Record ID: cern-2221689
Institution: European Organization for Nuclear Research (CERN)
Collection: CERN
Report Number: ATL-DAQ-SLIDE-2016-762
OAI Identifier: oai:cds.cern.ch:2221689
Record Dates: 2016-10-04; last modified 2019-09-30