Implementation of a model-independent search for new physics with the CMS detector exploiting the world-wide LHC Computing Grid
Main author:
Language: eng
Published: 2010
Subjects:
Online access: http://cds.cern.ch/record/1263613
Summary: With this year's start of CERN's Large Hadron Collider (LHC), it will be possible for the first time to directly probe physics at the TeV scale at a collider experiment. At this scale the Standard Model of particle physics will reach its limits and new physical phenomena are expected to appear. This study, performed with one of the LHC's experiments, the Compact Muon Solenoid (CMS), aims to quantify the understanding of the Standard Model and to hunt for deviations from the expectation by investigating a large fraction of the CMS data. While the classical approach to searches for physics beyond the Standard Model assumes a specific theoretical model and tries to isolate events with a signature characteristic of the new theory, this thesis follows a model-independent approach. The method relies only on the knowledge of the Standard Model and is suited to spotting deviations from this model induced by particular theoretical models, but also by theories not yet thought of. Future data are to be compared to the expectation in several hundred final-state topologies and in a few variables of general sensitivity to deviations, such as invariant masses. Within this feasibility study, events are classified according to their particle content (muons, electrons, photons, jets, missing energy) into so-called event classes. A broad data scan is performed by investigating distributions and searching for significant deviations from the Standard Model. Systematic uncertainties are rigorously taken into account within the analysis. Several theoretical models, such as supersymmetry, new heavy gauge bosons and microscopic black holes, as well as possible detector effects in the early data, have been fed into the search algorithm as benchmark scenarios and prove the method's ability to supplement the traditional model-driven searches.
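The classification into event classes described above can be sketched as follows. This is a minimal illustration, not the thesis's actual implementation: the particle labels, the MET threshold of 50 GeV, and the class-naming scheme are assumptions chosen for readability.

```python
from collections import Counter

def event_class(particles, met, met_threshold=50.0):
    """Build an event-class label from the reconstructed particle content.

    `particles` is a list of object-type strings ('mu', 'e', 'gamma', 'jet');
    `met` is the missing transverse energy in GeV. The threshold is
    illustrative, not the value used in the analysis.
    """
    counts = Counter(particles)
    # Concatenate the multiplicity of each object type, in a fixed order,
    # so that e.g. two muons and one jet yield the class "2mu1jet".
    label = "".join(f"{counts[p]}{p}"
                    for p in ("mu", "e", "gamma", "jet") if counts[p])
    if met > met_threshold:
        label += "+MET"          # significant missing energy becomes part of the class
    return label or "empty"

# Group a toy dataset into exclusive event classes and count their populations;
# in the real analysis, each class's distributions would then be compared to
# the Standard Model expectation.
events = [
    (["mu", "mu", "jet"], 10.0),
    (["e", "jet", "jet"], 80.0),
    (["mu", "mu", "jet"], 5.0),
]
populations = Counter(event_class(p, met) for p, met in events)
```

Keying events by such a label makes the classes mutually exclusive, so every event enters exactly one class and the scan over all classes covers the data without double counting.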
Due to the enormous computing resources required for such an analysis, which amounts to performing a multitude of classical analyses in parallel, the approach would not be feasible without the increasing performance and decreasing cost of modern computing systems. The LHC and its experiments, with expected data rates of several tens of petabytes per year, face this challenge with a distributed, locally organized computing and storage network: the LHC Computing Grid. The CMS tools embedded in this environment and their application are demonstrated within this work.