Overall quality optimization for DQM stage in High Energy Physics experiments

Bibliographic Details
Main Authors: Benekos, N; Parra-Royon, M; Benitez, J M
Language: eng
Published: IOP 2020
Subjects:
Online Access: https://dx.doi.org/10.1088/1742-6596/1525/1/012063
http://cds.cern.ch/record/2725601
Description
Summary: Data Acquisition (DAQ) and Data Quality Monitoring (DQM) are key parts of the HEP data chain, where the data are processed and analyzed to obtain accurate monitoring quality indicators. These stages are complex, involving an intense processing workflow and requiring a high degree of interoperability between software and hardware facilities. Data recorded by DAQ sensors and devices are sampled to perform live (and offline) DQM of the status of the detector during data collection, giving the system and scientists the ability to identify problems with extremely low latency and minimizing the amount of data that would otherwise be unsuitable for physics analysis. The DQM stage performs a large set of operations (Fast Fourier Transform (FFT), clustering, classification algorithms, Region of Interest selection, particle tracking, etc.) that consume computing resources and time, depending on the number of events in the experiment, the data sampling, the complexity of the tasks, and the required quality performance. The objective of our work is to present a proposal aimed at a general optimization of the DQM stage that takes all of these elements into account. Techniques based on computational intelligence, such as evolutionary algorithms (EA), can help improve performance and thus optimize task scheduling in DQM.
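
The record itself gives no implementation details for the EA-based scheduling it mentions; the sketch below is only a minimal illustration of how an evolutionary algorithm could assign DQM tasks to parallel processing slots so as to minimize the makespan. All task names, durations, slot counts, and algorithm parameters are hypothetical and do not come from the paper.

# Minimal genetic-algorithm sketch for DQM task scheduling (illustrative only).
# Chromosome: a list mapping each task index to a processing slot.
# Objective: minimize the makespan (finish time of the busiest slot).
import random

# Hypothetical DQM tasks with assumed processing costs (arbitrary time units).
TASKS = {"fft": 4.0, "clustering": 6.0, "classification": 5.0,
         "roi_selection": 2.0, "tracking": 8.0, "histogramming": 3.0}
TASK_NAMES = list(TASKS)
N_SLOTS = 3          # assumed number of parallel processing slots
POP_SIZE = 40
GENERATIONS = 100
MUTATION_RATE = 0.1

def makespan(assignment):
    """Finish time of the most loaded slot for a given task->slot assignment."""
    loads = [0.0] * N_SLOTS
    for task_idx, slot in enumerate(assignment):
        loads[slot] += TASKS[TASK_NAMES[task_idx]]
    return max(loads)

def random_individual():
    """Random assignment of every task to some slot."""
    return [random.randrange(N_SLOTS) for _ in TASK_NAMES]

def crossover(a, b):
    """Single-point crossover of two assignments."""
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def mutate(ind):
    """Occasionally reassign one task to a random slot."""
    if random.random() < MUTATION_RATE:
        ind[random.randrange(len(ind))] = random.randrange(N_SLOTS)
    return ind

def evolve():
    population = [random_individual() for _ in range(POP_SIZE)]
    for _ in range(GENERATIONS):
        population.sort(key=makespan)            # lower makespan is better
        survivors = population[:POP_SIZE // 2]   # simple truncation selection
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in range(POP_SIZE - len(survivors))]
        population = survivors + children
    best = min(population, key=makespan)
    return best, makespan(best)

if __name__ == "__main__":
    schedule, cost = evolve()
    for name, slot in zip(TASK_NAMES, schedule):
        print(f"{name:15s} -> slot {slot}")
    print(f"makespan: {cost:.1f}")

Truncation selection with single-point crossover is about the simplest EA variant; a real DQM scheduler along the lines proposed in the paper would additionally have to model task dependencies, sampling rates, and latency constraints.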