ATLAS Data Challenge 1
In 2002 the ATLAS experiment started a series of Data Challenges (DC) whose goals are to validate the Computing Model, the complete software suite and the data model, and to ensure the correctness of the technical choices to be made. A major feature of the first Data Challenge (DC1)...
Main author: | Poulard, G |
---|---|
Language: | eng |
Published: | 2003 |
Subjects: | Computing and Computers |
Online access: | http://cds.cern.ch/record/621592 |
_version_ | 1780900409202704384 |
---|---|
author | Poulard, G |
author_facet | Poulard, G |
author_sort | Poulard, G |
collection | CERN |
description | In 2002 the ATLAS experiment started a series of Data Challenges (DC) whose goals are to validate the Computing Model, the complete software suite and the data model, and to ensure the correctness of the technical choices to be made. A major feature of the first Data Challenge (DC1) was the preparation and deployment of the software required for the production of large event samples for the High Level Trigger (HLT) and physics communities, and the production of those samples as a world-wide distributed activity. The first phase of DC1 was run during summer 2002 and involved 39 institutes in 18 countries. More than 10 million physics events and 30 million single-particle events were fully simulated. Over a period of about 40 calendar days, 71000 CPU-days were used, producing 30 Tbytes of data in about 35000 partitions. In the second phase the next processing step was performed with the participation of 56 institutes in 21 countries (~4000 processors used in parallel). The basic elements of the ATLAS Monte Carlo production system are described. We also present how the software suite was validated and how the participating sites were certified. Parts of these productions were already performed using different flavours of Grid middleware at ~20 sites. |
id | cern-621592 |
institution | Organización Europea para la Investigación Nuclear |
language | eng |
publishDate | 2003 |
record_format | invenio |
spelling | cern-621592 2019-09-30T06:29:59Z http://cds.cern.ch/record/621592 eng Poulard, G ATLAS Data Challenge 1 Computing and Computers In 2002 the ATLAS experiment started a series of Data Challenges (DC) whose goals are to validate the Computing Model, the complete software suite and the data model, and to ensure the correctness of the technical choices to be made. A major feature of the first Data Challenge (DC1) was the preparation and deployment of the software required for the production of large event samples for the High Level Trigger (HLT) and physics communities, and the production of those samples as a world-wide distributed activity. The first phase of DC1 was run during summer 2002 and involved 39 institutes in 18 countries. More than 10 million physics events and 30 million single-particle events were fully simulated. Over a period of about 40 calendar days, 71000 CPU-days were used, producing 30 Tbytes of data in about 35000 partitions. In the second phase the next processing step was performed with the participation of 56 institutes in 21 countries (~4000 processors used in parallel). The basic elements of the ATLAS Monte Carlo production system are described. We also present how the software suite was validated and how the participating sites were certified. Parts of these productions were already performed using different flavours of Grid middleware at ~20 sites. cs.DC/0306052 oai:cds.cern.ch:621592 2003-06-12 |
spellingShingle | Computing and Computers Poulard, G ATLAS Data Challenge 1 |
title | ATLAS Data Challenge 1 |
title_full | ATLAS Data Challenge 1 |
title_fullStr | ATLAS Data Challenge 1 |
title_full_unstemmed | ATLAS Data Challenge 1 |
title_short | ATLAS Data Challenge 1 |
title_sort | atlas data challenge 1 |
topic | Computing and Computers |
url | http://cds.cern.ch/record/621592 |
work_keys_str_mv | AT poulardg atlasdatachallenge1 |
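
The throughput figures quoted in the description lend themselves to a quick back-of-envelope check of scale. The sketch below is not part of the catalogue record: it simply takes the approximate DC1 phase-1 numbers reported there (71000 CPU-days over ~40 calendar days, 30 Tbytes in ~35000 partitions) and derives the implied average number of concurrently busy CPUs and the mean partition size.

```python
# Back-of-envelope estimates from the DC1 phase-1 figures quoted in the abstract.
# Inputs are the approximate numbers reported there; the derived values are rough
# averages for illustration, not official ATLAS statistics.

cpu_days = 71_000          # total CPU time consumed (CPU-days)
calendar_days = 40         # approximate duration of phase 1 (days)
data_tb = 30               # total output volume (Tbytes)
partitions = 35_000        # approximate number of output partitions

avg_busy_cpus = cpu_days / calendar_days          # ~1775 CPUs busy on average
mean_partition_gb = data_tb * 1000 / partitions   # ~0.86 GB per partition (decimal units)

print(f"Average concurrently busy CPUs: {avg_busy_cpus:.0f}")
print(f"Mean partition size: {mean_partition_gb:.2f} GB")
```

Running the sketch gives roughly 1775 CPUs busy on average and partitions of just under 1 GB each, consistent with the ~4000 processors quoted for the second, more parallel phase.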