
Symbolic Regression on FPGAs for Fast Machine Learning Inference

Bibliographic Details
Main authors: Tsoi, Ho Fung, Pol, Adrian Alan, Loncar, Vladimir, Govorkova, Ekaterina, Cranmer, Miles, Dasu, Sridhara, Elmer, Peter, Harris, Philip, Ojalvo, Isobel, Pierini, Maurizio
Language: eng
Published: 2023
Subjects: physics.ins-det (Detectors and Experimental Techniques); hep-ex (Particle Physics - Experiment); cs.LG (Computing and Computers)
Online access: http://cds.cern.ch/record/2858515
author Tsoi, Ho Fung
Pol, Adrian Alan
Loncar, Vladimir
Govorkova, Ekaterina
Cranmer, Miles
Dasu, Sridhara
Elmer, Peter
Harris, Philip
Ojalvo, Isobel
Pierini, Maurizio
collection CERN
description The high-energy physics community is investigating the feasibility of deploying machine-learning-based solutions on Field-Programmable Gate Arrays (FPGAs) to improve physics sensitivity while meeting data processing latency limitations. In this contribution, we introduce a novel end-to-end procedure that utilizes a machine learning technique called symbolic regression (SR), which searches equation space to discover algebraic relations approximating a dataset. We use PySR (software for uncovering these expressions based on an evolutionary algorithm) and extend the functionality of hls4ml (a package for machine learning inference in FPGAs) to support PySR-generated expressions for resource-constrained production environments. Deep learning models are often optimised for a single top metric at a fixed network size, because the vast hyperparameter space prevents an extensive neural architecture search. Conversely, SR returns a set of models on the Pareto front, which allows the performance-resource tradeoff to be optimised directly. By embedding symbolic forms, our implementation can dramatically reduce the computational resources needed to perform critical tasks. We validate our procedure on a physics benchmark: multiclass classification of jets produced in simulated proton-proton collisions at the CERN Large Hadron Collider, and show that we approximate a three-layer neural network with an inference model that has as low as 5 ns execution time (a reduction by a factor of 13) and over 90% approximation accuracy.
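The abstract's point that SR "returns a set of models on the Pareto front" can be illustrated with a minimal sketch (not the authors' code; the candidate scores below are invented for illustration): given candidate symbolic expressions scored by complexity and validation loss, keep only the non-dominated ones, mirroring the complexity/accuracy frontier that tools such as PySR expose for selecting a performance-resource tradeoff.

```python
def pareto_front(candidates):
    """Return the non-dominated (complexity, loss) pairs; lower is better in both.

    A candidate is dominated if some other candidate is at least as good
    in both dimensions (and not the identical pair).
    """
    front = []
    for c, l in candidates:
        dominated = any(
            c2 <= c and l2 <= l and (c2, l2) != (c, l)
            for c2, l2 in candidates
        )
        if not dominated:
            front.append((c, l))
    return sorted(front)

# Hypothetical expressions: (expression complexity, validation loss).
candidates = [(1, 0.40), (3, 0.25), (3, 0.30), (5, 0.10), (7, 0.12)]
# (3, 0.30) is dominated by (3, 0.25); (7, 0.12) is dominated by (5, 0.10).
print(pareto_front(candidates))  # → [(1, 0.4), (3, 0.25), (5, 0.1)]
```

From such a front, one would pick the simplest expression that still meets the accuracy target, since on an FPGA expression complexity maps directly to resource usage and latency.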
id cern-2858515
institution European Organization for Nuclear Research
language eng
publishDate 2023
record_format invenio
spelling cern-2858515 2023-06-05T12:21:48Z arXiv:2305.04099 oai:cds.cern.ch:2858515 2023-05-06
title Symbolic Regression on FPGAs for Fast Machine Learning Inference
topic physics.ins-det
Detectors and Experimental Techniques
hep-ex
Particle Physics - Experiment
cs.LG
Computing and Computers
url http://cds.cern.ch/record/2858515