Computing needs of LHC experiments and prospects for a world wide grid for particle physics
Main Authors:
Language: eng
Published: 2005
Subjects:
Online Access: http://cds.cern.ch/record/912253
Summary: Several new experiments in particle physics are being prepared by large international consortia. They will generate data at rates of 100-200 MB/s over a number of years, resulting in many PB (10^15 bytes) of information. These data will have to be made accessible to a large, international community of researchers and, as such, call for a new approach to the problem of data analysis. Estimates of the computing needs of future experiments, as well as scenarios for overcoming potential difficulties, are presented based on studies conducted by Large Hadron Collider (LHC) consortia and computing projects based on distributed resources and data (Grid). Brief information on the operation of the LHC Computing Grid project is provided, together with a description of the first installations. Examples of large-scale Monte Carlo simulations are also given.
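The summary's step from a 100-200 MB/s rate to "many PB" is simple arithmetic. A minimal sketch of that estimate, assuming roughly 10^7 seconds of effective data taking per year (a common accelerator-physics rule of thumb, not a figure from this record):

```python
# Back-of-the-envelope estimate of annual data volume for an LHC experiment,
# based on the 100-200 MB/s rates quoted in the summary.
# The 1e7 s/year of effective data taking is an assumption, not a figure
# taken from the record.

SECONDS_PER_YEAR = 1e7  # assumed effective data-taking time per year


def annual_volume_pb(rate_mb_per_s: float) -> float:
    """Convert a sustained rate in MB/s into petabytes per year."""
    megabytes_per_year = rate_mb_per_s * SECONDS_PER_YEAR
    return megabytes_per_year / 1e9  # 1 PB = 10^15 B = 10^9 MB


for rate in (100, 200):
    print(f"{rate} MB/s -> {annual_volume_pb(rate):.1f} PB/year")
```

At these rates a single experiment accumulates roughly 1-2 PB per year, so several experiments running over a number of years readily reach the many-PB scale the summary describes.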