Application of Grid technologies and search for exotic physics with the ATLAS experiment at the LHC
Main author:
Language: eng
Published: 2013
Subjects:
Online access: http://cds.cern.ch/record/1518755
Summary:

The work presented in this thesis has been performed within the ATLAS (A Toroidal LHC ApparatuS) collaboration. Two subjects have been investigated. The first is the Computing System Commissioning (CSC) production using an instance of the Production System (ProdSys), called Lexor, and the testing of ATLAS Distributed Analysis (ADA) with ProdSys. The second is the simulation, and subsequent analysis, of processes involving new particles predicted by the Little Higgs model within the ATLAS detector.

An introduction to the Standard Model (SM), the Large Hadron Collider (LHC), and the ATLAS experiment, software and computing is given in chapter 1. The problems of the SM are discussed and some proposed solutions are reviewed. The SM introduction is followed by an overview of the LHC and ATLAS. The main ATLAS subsystems are described, and the ATLAS software and computing model is discussed.

Many physics processes within and beyond the Standard Model involve b-quark decays. New heavy particles, expected in models like the Little Higgs model, produce high-pT b-jets. For this reason, a detailed b-tagging study of high- and very-high-pT jets, using the full ATLAS detector simulation, is presented in chapter 2.

The Little Higgs model is a novel theory introduced to solve the hierarchy problem of the Standard Model without Supersymmetry (SUSY). In this model, new heavy particles cancel the radiative corrections of the Standard Model to the Higgs mass, allowing a low-mass Higgs boson. These new particles are within the reach of the LHC. The simplest model based on the Little Higgs ideas is the so-called Littlest Higgs model, which has been used as the reference for the simulation studies presented in this thesis. In chapter 3, the principles and phenomenology of this model are discussed, and the possibility of discovering the newly predicted particles with the ATLAS detector is investigated. More precisely, the hadronic decay channels of the heavy gauge bosons are analysed using the ATLAS fast detector simulation (ATLFAST).

The ATLAS collaboration is preparing for data taking, scheduled to start in 2008. After data collection by the online trigger, the expected data volume recorded for offline reconstruction and analysis is of the order of a few petabytes per year, to be processed by institutions distributed worldwide. The required computing resources are therefore two orders of magnitude larger than for previous experiments in particle physics, and the LHC experiments need a worldwide distributed data management and computing system to handle this huge amount of data. The LHC Computing Review in 2001 recommended that the LHC experiments carry out Data Challenges (DCs) of increasing size and complexity in order to be prepared for LHC data taking. The goal of the ATLAS Data Challenges is the validation of the ATLAS computing model. A description of the ATLAS Production System, which manages massive productions of Monte Carlo data using Grid resources, and the main results obtained in the ATLAS DCs are given in chapter 4, together with a description of the IFIC contribution to the ATLAS CSC simulation production running an ATLAS ProdSys instance.

Finally, as part of the preparation for data taking and physics analysis, the ATLAS Distributed Analysis model and tools are described in chapter 5, including an example of their application using the ADA ProdSys. A description of a Tier-3 prototype for IFIC is presented as well.
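The summary's statement that new heavy particles "cancel the radiative corrections of the Standard Model to the Higgs mass" refers to the standard one-loop cancellation between the top quark and its heavy partner T in the Littlest Higgs model. The sketch below shows the usual textbook form of this cancellation; the symbols λ_t (top Yukawa coupling), Λ (cutoff scale) and m_T (heavy top partner mass) and the overall normalization follow common Little Higgs literature conventions and are not taken from the thesis itself.

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% One-loop top-quark contribution to the Higgs mass parameter,
% quadratically divergent in the cutoff \Lambda:
\[
  \delta m_H^2\big|_{t} \simeq -\frac{3\lambda_t^2}{8\pi^2}\,\Lambda^2 .
\]
% In the Littlest Higgs model the heavy top partner T contributes with
% the opposite sign, so the quadratic divergence cancels and only a
% logarithmic sensitivity to the partner mass m_T remains:
\[
  \delta m_H^2\big|_{T} \simeq +\frac{3\lambda_t^2}{8\pi^2}\,\Lambda^2 ,
  \qquad
  \delta m_H^2\big|_{t+T} \sim
  \frac{3\lambda_t^2\, m_T^2}{8\pi^2}\,\ln\frac{\Lambda^2}{m_T^2} .
\]
\end{document}
```

Because the residual correction grows only logarithmically with Λ, a light Higgs boson remains natural provided m_T is not too heavy, which is why the new states are expected within LHC reach and motivates the searches described in chapter 3.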