
CMS experience with online and offline Databases

The CMS experiment is made of many detectors which in total sum up to more than 75 million channels. The online database stores the configuration data used to configure the various parts of the detector and bring it into all possible running states. The database also stores the conditions data: detector monitoring parameters of all channels (temperatures, voltages), detector quality information, beam conditions, etc. These quantities are used by the experts to monitor the detector performance in detail; as they occupy a very large space in the online database, they cannot be used as-is for offline data reconstruction. For this, a "condensed" set of the full information, the "conditions data", is created and copied to a separate database used in the offline reconstruction. The offline conditions database contains the alignment and calibration data for the various detectors. Conditions data sets are accessed by a tag and an interval of validity through the offline reconstruction program CMSSW, written in C++. Performant access to the conditions data as C++ objects is a key requirement for reconstruction and data analysis. About 200 types of calibration and alignment data exist for the various CMS sub-detectors. Only those data which are crucial for reconstruction are inserted into the offline conditions DB. This guarantees fast access to conditions during reconstruction and a small size of the conditions DB. Calibration and alignment data are fundamental to maintaining the design performance of the experiment. Very fast workflows have been put in place to compute and validate the alignment and calibration sets and insert them into the conditions database before the reconstruction process starts. Some of these sets are produced by analyzing and summarizing the parameters stored in the online database; others are computed from event data through a special express workflow. A dedicated monitoring system has been set up to monitor these time-critical processes. The talk describes the experience with the CMS online and offline databases during the 2010 and 2011 data-taking periods, showing some of the issues found and lessons learned.
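
The abstract notes that conditions data sets are retrieved as C++ objects by tag and interval of validity (IOV) through CMSSW. The sketch below is a minimal, self-contained illustration of that access pattern only; the type and tag names (Payload, ConditionsTag, "EcalPedestals_v1") are hypothetical and do not reflect the actual CMSSW interfaces.

// Illustrative sketch only: hypothetical types showing the tag + interval-of-validity
// (IOV) access pattern described in the abstract. Not the CMSSW API.
#include <cstdint>
#include <map>
#include <memory>
#include <stdexcept>
#include <string>
#include <vector>

// A calibration payload, e.g. per-channel pedestals (hypothetical).
struct Payload {
    std::vector<float> values;
};

// One conditions "tag": a sequence of payloads, each valid from a given run onward.
class ConditionsTag {
public:
    // Register a payload whose validity starts at 'firstRun'.
    void insert(uint64_t firstRun, std::shared_ptr<const Payload> p) {
        iovs_[firstRun] = std::move(p);
    }
    // Return the payload valid for 'run': the entry with the largest
    // firstRun <= run (interval-of-validity semantics).
    std::shared_ptr<const Payload> get(uint64_t run) const {
        auto it = iovs_.upper_bound(run);
        if (it == iovs_.begin()) throw std::runtime_error("no IOV covers this run");
        return std::prev(it)->second;
    }
private:
    std::map<uint64_t, std::shared_ptr<const Payload>> iovs_;
};

int main() {
    // A hypothetical offline conditions "database": tags indexed by name.
    std::map<std::string, ConditionsTag> db;
    auto ped = std::make_shared<Payload>(Payload{{1.2f, 1.3f, 1.1f}});
    db["EcalPedestals_v1"].insert(160000, ped);   // valid from run 160000 onward

    // Reconstruction-style access: tag name + run number -> C++ object.
    auto p = db["EcalPedestals_v1"].get(163500);
    return p->values.empty() ? 1 : 0;
}

Keying the map by the first run of each interval makes the lookup a single ordered search: the payload in effect for a given run is the one with the largest starting run not greater than it.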


Bibliographic Details
Main author: Pfeiffer, Andreas
Language: English
Published: 2012
Subjects: Detectors and Experimental Techniques
Online access: https://dx.doi.org/10.1088/1742-6596/396/5/052059
http://cds.cern.ch/record/1457655
Record ID: cern-1457655 (oai:cds.cern.ch:1457655)
Report number: CMS-CR-2012-094
Collection: CERN
Institution: European Organization for Nuclear Research (CERN)
Record date: 2012-05-14