
The ATLAS Trigger and Data Acquisition Upgrades for the High-Luminosity LHC (HL-LHC)

The ATLAS experiment at CERN has started the construction of upgrades for the High Luminosity LHC (HL-LHC), with collisions due to start in 2026. In order to deliver an order of magnitude more data than previous LHC runs, 7 TeV protons will collide at 14 TeV with an instantaneous luminosity of up to $7.5 \cdot 10^{34}\;\text{cm}^{-2}\text{s}^{-1}$, resulting in much higher pileup and data rates than the current experiment was designed to handle. While this is essential to realise the physics program, it presents a huge challenge for the detector, trigger, data acquisition and computing. The detector upgrades themselves present new requirements and opportunities for the trigger and data acquisition system. The approved baseline design of the TDAQ upgrade comprises: a hardware-based low-latency real-time trigger operating at 40 MHz, data acquisition which combines custom readout with commodity hardware and networking to deal with 5.2 TB/s input, and an event filter running at 1 MHz, which combines offline-like algorithms on a large set of commodity servers and hardware tracking. Commodity servers and networks are used, with custom ATCA boards, high speed links and powerful FPGAs deployed in the low-latency parts of the system. Offline-style clustering and jet-finding in FPGAs, and track reconstruction with Associative Memory ASICs and FPGAs are designed to combat pileup in the hardware trigger and the event filter respectively. This document reports recent progress on the design of the system and the performance on key physics processes.
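The rates quoted in the abstract allow a quick back-of-the-envelope consistency check. The sketch below is illustrative only: it assumes the 5.2 TB/s DAQ input corresponds to the 1 MHz event stream sent to the event filter (so the two numbers imply an average event size of roughly 5.2 MB), and it estimates the mean pileup from the quoted luminosity using an assumed inelastic pp cross section of about 80 mb and a typical HL-LHC filling scheme; none of these auxiliary numbers come from the record itself.

```python
# Back-of-the-envelope checks of the TDAQ numbers quoted in the abstract.
# Assumptions (NOT stated in the abstract): the 5.2 TB/s DAQ input refers to
# the 1 MHz event stream, sigma_inel(pp) ~ 80 mb at 14 TeV, ~2750 colliding
# bunch pairs, LHC revolution frequency 11245 Hz.

# Quantities quoted in the abstract
luminosity_cm2_s     = 7.5e34   # instantaneous luminosity, cm^-2 s^-1
daq_input_tb_s       = 5.2      # DAQ input bandwidth, TB/s
event_filter_rate_hz = 1.0e6    # event-filter input rate, Hz
hw_trigger_rate_hz   = 40.0e6   # hardware-trigger (bunch-crossing) rate, Hz

# Assumed auxiliary quantities (not from the record)
sigma_inelastic_cm2  = 80e-27   # ~80 mb inelastic pp cross section
n_colliding_bunches  = 2750     # approximate HL-LHC filling scheme
f_revolution_hz      = 11245    # LHC revolution frequency

# Average event size implied by the quoted bandwidth and event-filter rate
event_size_mb = daq_input_tb_s * 1e6 / event_filter_rate_hz  # TB/s -> MB/event

# Rejection factor of the hardware trigger (40 MHz -> 1 MHz)
hw_rejection = hw_trigger_rate_hz / event_filter_rate_hz

# Mean pileup: interactions per crossing = L * sigma_inel / crossing rate
crossing_rate_hz = n_colliding_bunches * f_revolution_hz
mean_pileup = luminosity_cm2_s * sigma_inelastic_cm2 / crossing_rate_hz

print(f"Implied average event size : {event_size_mb:.1f} MB")  # ~5.2 MB
print(f"Hardware-trigger rejection : {hw_rejection:.0f}x")     # 40x
print(f"Estimated mean pileup <mu> : {mean_pileup:.0f}")       # ~190-200
```

Running this reproduces the order of magnitude behind the "much higher pileup" the abstract refers to, a mean of roughly 200 simultaneous interactions per bunch crossing, under the stated assumptions.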


Bibliographic Details
Main author: Valente, Marco
Language: eng
Published: SISSA 2019
Subjects: Particle Physics - Experiment; Detectors and Experimental Techniques
Online access: https://dx.doi.org/10.22323/1.364.0184
http://cds.cern.ch/record/2692161
_version_ 1780963929737920512
author Valente, Marco
author_facet Valente, Marco
author_sort Valente, Marco
collection CERN
description The ATLAS experiment at CERN has started the construction of upgrades for the High Luminosity LHC (HL-LHC), with collisions due to start in 2026. In order to deliver an order of magnitude more data than previous LHC runs, 7 TeV protons will collide at 14 TeV with an instantaneous luminosity of up to $7.5 \cdot 10^{34}\;\text{cm}^{-2}\text{s}^{-1}$, resulting in much higher pileup and data rates than the current experiment was designed to handle. While this is essential to realise the physics program, it presents a huge challenge for the detector, trigger, data acquisition and computing. The detector upgrades themselves present new requirements and opportunities for the trigger and data acquisition system. The approved baseline design of the TDAQ upgrade comprises: a hardware-based low-latency real-time trigger operating at 40 MHz, data acquisition which combines custom readout with commodity hardware and networking to deal with 5.2 TB/s input, and an event filter running at 1 MHz, which combines offline-like algorithms on a large set of commodity servers and hardware tracking. Commodity servers and networks are used, with custom ATCA boards, high speed links and powerful FPGAs deployed in the low-latency parts of the system. Offline-style clustering and jet-finding in FPGAs, and track reconstruction with Associative Memory ASICs and FPGAs are designed to combat pileup in the hardware trigger and the event filter respectively. This document reports recent progress on the design of the system and the performance on key physics processes.
id cern-2692161
institution European Organization for Nuclear Research (CERN)
language eng
publishDate 2019
publisher SISSA
record_format invenio
spelling cern-2692161
2022-10-12T19:16:16Z
doi:10.22323/1.364.0184
http://cds.cern.ch/record/2692161
eng
Valente, Marco
The ATLAS Trigger and Data Acquisition Upgrades for the High-Luminosity LHC (HL-LHC)
Particle Physics - Experiment
Detectors and Experimental Techniques
The ATLAS experiment at CERN has started the construction of upgrades for the High Luminosity LHC (HL-LHC), with collisions due to start in 2026. In order to deliver an order of magnitude more data than previous LHC runs, 7 TeV protons will collide at 14 TeV with an instantaneous luminosity of up to $7.5 \cdot 10^{34}\;\text{cm}^{-2}\text{s}^{-1}$, resulting in much higher pileup and data rates than the current experiment was designed to handle. While this is essential to realise the physics program, it presents a huge challenge for the detector, trigger, data acquisition and computing. The detector upgrades themselves present new requirements and opportunities for the trigger and data acquisition system. The approved baseline design of the TDAQ upgrade comprises: a hardware-based low-latency real-time trigger operating at 40 MHz, data acquisition which combines custom readout with commodity hardware and networking to deal with 5.2 TB/s input, and an event filter running at 1 MHz, which combines offline-like algorithms on a large set of commodity servers and hardware tracking. Commodity servers and networks are used, with custom ATCA boards, high speed links and powerful FPGAs deployed in the low-latency parts of the system. Offline-style clustering and jet-finding in FPGAs, and track reconstruction with Associative Memory ASICs and FPGAs are designed to combat pileup in the hardware trigger and the event filter respectively. This document reports recent progress on the design of the system and the performance on key physics processes.
SISSA
ATL-DAQ-PROC-2019-020
oai:cds.cern.ch:2692161
2019-10-05
spellingShingle Particle Physics - Experiment
Detectors and Experimental Techniques
Valente, Marco
The ATLAS Trigger and Data Acquisition Upgrades for the High-Luminosity LHC (HL-LHC)
title The ATLAS Trigger and Data Acquisition Upgrades for the High-Luminosity LHC (HL-LHC)
title_full The ATLAS Trigger and Data Acquisition Upgrades for the High-Luminosity LHC (HL-LHC)
title_fullStr The ATLAS Trigger and Data Acquisition Upgrades for the High-Luminosity LHC (HL-LHC)
title_full_unstemmed The ATLAS Trigger and Data Acquisition Upgrades for the High-Luminosity LHC (HL-LHC)
title_short The ATLAS Trigger and Data Acquisition Upgrades for the High-Luminosity LHC (HL-LHC)
title_sort atlas trigger and data acquisition upgrades for the high-luminosity lhc (hl-lhc)
topic Particle Physics - Experiment
Detectors and Experimental Techniques
url https://dx.doi.org/10.22323/1.364.0184
http://cds.cern.ch/record/2692161
work_keys_str_mv AT valentemarco theatlastriggeranddataacquisitionupgradesforthehighluminositylhchllhc
AT valentemarco atlastriggeranddataacquisitionupgradesforthehighluminositylhchllhc