CMS Grid Computing and Monitoring of CMS Data Quality Using Selected Data Samples

Bibliographic Details
Main Author: Chen, Zhilin
Language: eng
Published: ETH 2011
Subjects: Computing and Computers; Detectors and Experimental Techniques
Online Access: https://dx.doi.org/10.3929/ethz-a-006444689
http://cds.cern.ch/record/1354249
_version_ 1780922376450473984
author Chen, Zhilin
author_facet Chen, Zhilin
author_sort Chen, Zhilin
collection CERN
description The Large Hadron Collider (LHC) at CERN, the European Center for Particle Physics in Geneva, is a proton-proton collider designed to operate at a center-of-mass energy of 14 TeV and a nominal luminosity of 10^34 cm^-2 s^-1. The main physics goals of the LHC experiments are the search for the Standard Model Higgs boson and for new physics phenomena beyond the Standard Model. The design and construction of the Compact Muon Solenoid (CMS) experiment at the LHC had to meet unprecedented challenges, both for the detector operation and for the data handling. Due to the high event rate and large event size, the LHC experiments generate a large amount of data, about 15 petabytes (10^15 bytes) per year at the design luminosity, which thousands of scientists at hundreds of research institutes and universities around the world access and analyse. In addition, detailed Monte Carlo simulations of various physics processes require large-scale computing power and huge amounts of mass storage. To meet these requirements, a novel globally distributed model for data storage and CPU power was chosen: the Worldwide LHC Computing Grid (WLCG). The WLCG collaboration in Switzerland provides computing infrastructure and resources to physicists from Swiss institutions involved in the LHC experiments, as well as to the experimental collaborations, by operating a high-performance Tier-2 center at CSCS in Manno and the Swiss CMS Tier-3 center at the Paul Scherrer Institute (PSI) in Villigen. This thesis reports on my work for the Swiss Tier-2 and the CMS Tier-3 centers. The two facilities passed several benchmarks, were upgraded continuously over the past years, and have shown excellent operational performance since the start-up of the LHC on 30 March 2010, providing proton-proton collisions at a center-of-mass energy of 7 TeV. The second part of this thesis describes studies of the detector performance and of data quality monitoring, which are key issues for the physics output, especially at the start of data taking. A data sample of Z → e+e− decays recorded up to September 2010 is selected to study the performance of the CMS detector and to monitor the data quality. Electrons and positrons are reconstructed and identified in the electromagnetic calorimeter, requiring a matching track in the tracking system. The measured invariant mass distribution obtained from the selected electron-positron pairs shows a clear Z mass peak with very little background. This result is in good agreement with the Monte Carlo predictions and illustrates the good data quality at the start of CMS operation.
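A note on the method sketched in the description: the Z → e+e− data-quality check hinges on the two-body invariant mass, m^2 = (E1 + E2)^2 − |p1 + p2|^2, computed from the reconstructed electron and positron four-momenta; pairs from Z decays cluster near m_Z ≈ 91.2 GeV. Below is a minimal Python sketch of that computation, assuming (pt, eta, phi, E) kinematics for each candidate; the numeric values and the mass window are hypothetical illustration inputs, not data or cuts taken from the thesis.

```python
import math

Z_MASS = 91.1876  # world-average Z boson mass in GeV

def four_vector(pt, eta, phi, energy):
    """Build an (E, px, py, pz) four-vector from collider kinematics:
    transverse momentum pt, pseudorapidity eta, azimuthal angle phi."""
    px = pt * math.cos(phi)
    py = pt * math.sin(phi)
    pz = pt * math.sinh(eta)
    return (energy, px, py, pz)

def invariant_mass(p1, p2):
    """Two-body invariant mass: m^2 = (E1 + E2)^2 - |p1 + p2|^2."""
    e = p1[0] + p2[0]
    px, py, pz = (p1[i] + p2[i] for i in range(1, 4))
    m2 = e * e - (px * px + py * py + pz * pz)
    return math.sqrt(m2) if m2 > 0.0 else 0.0  # guard against rounding

# Hypothetical electron/positron candidates (pt [GeV], eta, phi, E [GeV]),
# chosen roughly back-to-back so the pair reconstructs near the Z mass.
electron = four_vector(40.4, 0.5, 0.80, 45.6)
positron = four_vector(40.4, -0.5, -2.34, 45.6)

m_ee = invariant_mass(electron, positron)
print(f"m(e+e-) = {m_ee:.1f} GeV")  # ~91 GeV: would populate the Z peak
if abs(m_ee - Z_MASS) < 15.0:       # loose illustrative mass window
    print("pair enters the Z -> e+e- mass window")
```

In the thesis itself this selection is of course applied to fully reconstructed CMS events with calorimeter and tracking requirements; the sketch only shows the kinematic core of the check.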
id cern-1354249
institution European Organization for Nuclear Research
language eng
publishDate 2011
publisher ETH
record_format invenio
spelling cern-1354249 2019-09-30T06:29:59Z doi:10.3929/ethz-a-006444689 http://cds.cern.ch/record/1354249 eng Chen, Zhilin CMS Grid Computing and Monitoring of CMS Data Quality Using Selected Data Samples Computing and Computers Detectors and Experimental Techniques ETH CERN-THESIS-2011-021 ETH-19429 oai:cds.cern.ch:1354249 2011
spellingShingle Computing and Computers
Detectors and Experimental Techniques
Chen, Zhilin
CMS Grid Computing and Monitoring of CMS Data Quality Using Selected Data Samples
title CMS Grid Computing and Monitoring of CMS Data Quality Using Selected Data Samples
title_full CMS Grid Computing and Monitoring of CMS Data Quality Using Selected Data Samples
title_fullStr CMS Grid Computing and Monitoring of CMS Data Quality Using Selected Data Samples
title_full_unstemmed CMS Grid Computing and Monitoring of CMS Data Quality Using Selected Data Samples
title_short CMS Grid Computing and Monitoring of CMS Data Quality Using Selected Data Samples
title_sort cms grid computing and monitoring of cms data quality using selected data samples
topic Computing and Computers
Detectors and Experimental Techniques
url https://dx.doi.org/10.3929/ethz-a-006444689
http://cds.cern.ch/record/1354249
work_keys_str_mv AT chenzhilin cmsgridcomputingandmonitoringofcmsdataqualityusingselecteddatasamples