Generalization in quantum machine learning from few training data


Bibliographic Details
Main Authors: Caro, Matthias C., Huang, Hsin-Yuan, Cerezo, M., Sharma, Kunal, Sornborger, Andrew, Cincio, Lukasz, Coles, Patrick J.
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9395350/
https://www.ncbi.nlm.nih.gov/pubmed/35995777
http://dx.doi.org/10.1038/s41467-022-32550-3
author Caro, Matthias C.
Huang, Hsin-Yuan
Cerezo, M.
Sharma, Kunal
Sornborger, Andrew
Cincio, Lukasz
Coles, Patrick J.
collection PubMed
description Modern quantum machine learning (QML) methods involve variationally optimizing a parameterized quantum circuit on a training data set, and subsequently making predictions on a testing data set (i.e., generalizing). In this work, we provide a comprehensive study of generalization performance in QML after training on a limited number N of training data points. We show that the generalization error of a quantum machine learning model with T trainable gates scales at worst as $\sqrt{T/N}$. When only K ≪ T gates have undergone substantial change in the optimization process, we prove that the generalization error improves to $\sqrt{K/N}$. Our results imply that the compiling of unitaries into a polynomial number of native gates, a crucial application for the quantum computing industry that typically uses exponential-size training data, can be sped up significantly. We also show that classification of quantum states across a phase transition with a quantum convolutional neural network requires only a very small training data set. Other potential applications include learning quantum error correcting codes or quantum dynamical simulation. Our work injects new hope into the field of QML, as good generalization is guaranteed from few training data.
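As a rough illustration of what these rates mean in practice (a minimal sketch; the function names, example values, and the omission of constants and any logarithmic factors are ours, not the record's), the worst-case rate $\sqrt{T/N}$ and the improved rate $\sqrt{K/N}$ can be evaluated directly:

```python
import math

def worst_case_rate(T: int, N: int) -> float:
    """Worst-case generalization-error scaling, O(sqrt(T/N)), for a
    QML model with T trainable gates trained on N data points."""
    return math.sqrt(T / N)

def few_gates_rate(K: int, N: int) -> float:
    """Improved scaling, O(sqrt(K/N)), when only K << T gates change
    substantially during optimization."""
    return math.sqrt(K / N)

# Example: a circuit with T = 1000 trainable gates in which only
# K = 20 gates move appreciably during training.
T, K = 1000, 20
for N in (100, 1_000, 10_000):
    print(f"N={N:>6}: sqrt(T/N)={worst_case_rate(T, N):.3f}, "
          f"sqrt(K/N)={few_gates_rate(K, N):.3f}")
```

Inverting either rate shows why few data can suffice: reaching generalization error ε requires N on the order of T/ε² training points in the worst case, but only about K/ε² when just K gates change substantially.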
format Online
Article
Text
id pubmed-9395350
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-9395350 2022-08-24 Generalization in quantum machine learning from few training data Caro, Matthias C. Huang, Hsin-Yuan Cerezo, M. Sharma, Kunal Sornborger, Andrew Cincio, Lukasz Coles, Patrick J. Nat Commun Article Nature Publishing Group UK 2022-08-22 /pmc/articles/PMC9395350/ /pubmed/35995777 http://dx.doi.org/10.1038/s41467-022-32550-3 Text en © The Author(s) 2022, corrected publication 2022. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
title Generalization in quantum machine learning from few training data
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9395350/
https://www.ncbi.nlm.nih.gov/pubmed/35995777
http://dx.doi.org/10.1038/s41467-022-32550-3