
Data Logistics and the CMS Analysis Model

The Compact Muon Solenoid Experiment (CMS) at the Large Hadron Collider (LHC) at CERN has brilliant prospects for uncovering new information about the physical structure of our universe. Soon physicists around the world will participate together in analyzing CMS data in search of new physics phenomena and the Higgs Boson. However, they face a significant problem: with 5 Petabytes of data needing distribution each year, how will physicists get the data they need? How and where will they be able to analyze it? Computing resources and scientists are scattered around the world, while CMS data exists in localized chunks. The CMS computing model only allows analysis of locally stored data, “tethering” analysis to storage. The Vanderbilt CMS team is actively working to solve this problem with the Research and Education Data Depot Network (REDDnet), a program run by Vanderbilt’s Advanced Computing Center for Research and Education (ACCRE).
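To make the “tethering” constraint concrete, here is a toy sketch (plain illustrative Python, not CMS or REDDnet software; the site names, dataset path, and replica catalog are all hypothetical) contrasting the locality-tethered model, where a job may only run at a site holding a local replica, with a depot model, where any site can run the job and stream the data:

# Toy illustration only: contrasts "tethered" scheduling with a depot model.
# All names below are made up for the example.

# Hypothetical catalog: which sites hold a local replica of each dataset.
LOCAL_REPLICAS = {
    "/Run2009/MinBias/RECO": {"T1_US_FNAL", "T2_US_Vanderbilt"},
}

def schedule_tethered(dataset, candidate_sites):
    """Tethered model: the job must run where the data already lives."""
    eligible = [s for s in candidate_sites
                if s in LOCAL_REPLICAS.get(dataset, set())]
    if not eligible:
        raise RuntimeError("no site holds a local copy of " + dataset)
    return eligible[0]

def schedule_depot(dataset, candidate_sites):
    """Depot model: any site may run the job and pull data from depots."""
    return candidate_sites[0]  # data locality no longer constrains placement

sites = ["T2_DE_DESY", "T2_US_Vanderbilt"]
print(schedule_depot("/Run2009/MinBias/RECO", sites))     # T2_DE_DESY
print(schedule_tethered("/Run2009/MinBias/RECO", sites))  # T2_US_Vanderbilt

Under the tethered model the first site is rejected because it lacks a replica; under the depot model the same job runs anywhere, which is the decoupling of analysis from storage that the abstract attributes to REDDnet.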


Bibliographic Details
Main Author: Managan, Julie E
Language: eng
Published: U. 2009
Subjects:
Online Access: http://cds.cern.ch/record/1331888
_version_ 1780921755971354624
author Managan, Julie E
author_facet Managan, Julie E
author_sort Managan, Julie E
collection CERN
description The Compact Muon Solenoid Experiment (CMS) at the Large Hadron Collider (LHC) at CERN has brilliant prospects for uncovering new information about the physical structure of our universe. Soon physicists around the world will participate together in analyzing CMS data in search of new physics phenomena and the Higgs Boson. However, they face a significant problem: with 5 Petabytes of data needing distribution each year, how will physicists get the data they need? How and where will they be able to analyze it? Computing resources and scientists are scattered around the world, while CMS data exists in localized chunks. The CMS computing model only allows analysis of locally stored data, “tethering” analysis to storage. The Vanderbilt CMS team is actively working to solve this problem with the Research and Education Data Depot Network (REDDnet), a program run by Vanderbilt’s Advanced Computing Center for Research and Education (ACCRE).
id cern-1331888
institution European Organization for Nuclear Research
language eng
publishDate 2009
publisher U.
record_format invenio
spelling cern-1331888 2019-09-30T06:29:59Z http://cds.cern.ch/record/1331888 eng Managan, Julie E Data Logistics and the CMS Analysis Model Computing and Computers The Compact Muon Solenoid Experiment (CMS) at the Large Hadron Collider (LHC) at CERN has brilliant prospects for uncovering new information about the physical structure of our universe. Soon physicists around the world will participate together in analyzing CMS data in search of new physics phenomena and the Higgs Boson. However, they face a significant problem: with 5 Petabytes of data needing distribution each year, how will physicists get the data they need? How and where will they be able to analyze it? Computing resources and scientists are scattered around the world, while CMS data exists in localized chunks. The CMS computing model only allows analysis of locally stored data, “tethering” analysis to storage. The Vanderbilt CMS team is actively working to solve this problem with the Research and Education Data Depot Network (REDDnet), a program run by Vanderbilt’s Advanced Computing Center for Research and Education (ACCRE). U. CERN-THESIS-2009-263 oai:cds.cern.ch:1331888 2009
spellingShingle Computing and Computers
Managan, Julie E
Data Logistics and the CMS Analysis Model
title Data Logistics and the CMS Analysis Model
title_full Data Logistics and the CMS Analysis Model
title_fullStr Data Logistics and the CMS Analysis Model
title_full_unstemmed Data Logistics and the CMS Analysis Model
title_short Data Logistics and the CMS Analysis Model
title_sort data logistics and the cms analysis model
topic Computing and Computers
url http://cds.cern.ch/record/1331888
work_keys_str_mv AT managanjuliee datalogisticsandthecmsanalysismodel