Generative and discriminative training of Boltzmann machine through quantum annealing
A hybrid quantum-classical method for training Boltzmann machines (BMs) on generative and discriminative tasks is presented. BMs are undirected graphical models with a network of visible and hidden nodes, where the visible nodes serve as read-out sites and the hidden nodes shape the probability distribution over visible states...
Main Authors: | Srivastava, Siddhartha; Sundararaghavan, Veera |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10188519/ https://www.ncbi.nlm.nih.gov/pubmed/37193710 http://dx.doi.org/10.1038/s41598-023-34652-4 |
_version_ | 1785042931610025984 |
---|---|
author | Srivastava, Siddhartha Sundararaghavan, Veera |
author_facet | Srivastava, Siddhartha Sundararaghavan, Veera |
author_sort | Srivastava, Siddhartha |
collection | PubMed |
description | A hybrid quantum-classical method for training Boltzmann machines (BMs) on generative and discriminative tasks is presented. BMs are undirected graphical models with a network of visible and hidden nodes: the visible nodes serve as read-out sites, while the hidden nodes shape the probability distribution over visible states. In a generative BM, samples of the visible nodes imitate the probability distribution of a given data set. In a discriminative BM, by contrast, the visible sites are treated as input/output (I/O) read-out sites, and the conditional probability of the output state is optimized for a given set of input states. The cost function for training is defined as a weighted sum of the Kullback-Leibler (KL) divergence and the negative conditional log-likelihood (NCLL), balanced by a hyper-parameter: the KL divergence is the cost for generative learning, and the NCLL is the cost for discriminative learning. A stochastic Newton-Raphson optimization scheme is presented in which the gradients and Hessians are approximated from direct samples of the BM obtained through quantum annealing. Quantum annealers are hardware realizations of the Ising model that operate at a low but finite temperature. This temperature affects the probability distribution of the BM, but its value is unknown. Previous efforts have estimated this unknown temperature by regressing the theoretical Boltzmann energies of sampled states against the probabilities of the states sampled by the actual hardware. These approaches assume that changing the control parameters does not affect the system temperature, which is usually untrue. Here, instead of using energies, the probability distribution of the samples is used to estimate the optimal parameter set, ensuring that the optimal set can be obtained from a single set of samples. The KL divergence and the NCLL are optimized with respect to the system temperature, and the result is used to rescale the control parameter set. The performance of this approach, tested against the theoretically expected distributions, shows promising results for Boltzmann training on quantum annealers. |
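The cost function described in the abstract can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes discrete states encoded as integers, estimates distributions by simple counting (standing in for annealer samples), and uses an assumed `alpha` weighting convention for the KL/NCLL trade-off; all function names here are hypothetical.

```python
import numpy as np

def empirical_dist(samples, n_states):
    """Estimate a probability distribution from integer-encoded samples
    (in the paper's setting, samples drawn from a quantum annealer)."""
    counts = np.bincount(samples, minlength=n_states).astype(float)
    return counts / counts.sum()

def kl_divergence(p_data, p_model, eps=1e-12):
    """KL(p_data || p_model): the generative part of the cost."""
    p = np.clip(p_data, eps, 1.0)
    q = np.clip(p_model, eps, 1.0)
    return float(np.sum(p * np.log(p / q)))

def ncll(p_joint, p_input, io_pairs, eps=1e-12):
    """Negative conditional log-likelihood of outputs given inputs:
    the discriminative part of the cost. p_joint[i, o] is the model's
    joint probability of (input state i, output state o)."""
    total = 0.0
    for i, o in io_pairs:
        cond = p_joint[i, o] / max(p_input[i], eps)
        total -= np.log(max(cond, eps))
    return total / len(io_pairs)

def weighted_cost(alpha, p_data, p_model, p_joint, p_input, io_pairs):
    """Weighted sum of KL divergence and NCLL, balanced by hyper-parameter
    alpha (alpha = 1 recovers purely generative training)."""
    return alpha * kl_divergence(p_data, p_model) + \
        (1.0 - alpha) * ncll(p_joint, p_input, io_pairs)

def rescale_controls(params, t_est):
    """Rescale Ising control parameters by an estimated effective
    temperature, as in the temperature-correction step of the abstract."""
    return params / t_est
```

With `alpha = 1` only the KL term contributes (generative training); with `alpha = 0` only the NCLL term does (discriminative training). The final `rescale_controls` step reflects the abstract's idea that an effective hardware temperature, once estimated from sample distributions, can be absorbed into the control parameters.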
format | Online Article Text |
id | pubmed-10188519 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-10188519 2023-05-18 Generative and discriminative training of Boltzmann machine through quantum annealing Srivastava, Siddhartha Sundararaghavan, Veera Sci Rep Article Nature Publishing Group UK 2023-05-16 /pmc/articles/PMC10188519/ /pubmed/37193710 http://dx.doi.org/10.1038/s41598-023-34652-4 Text en © The Author(s) 2023. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, provided appropriate credit is given to the original author(s) and the source. |
spellingShingle | Article Srivastava, Siddhartha Sundararaghavan, Veera Generative and discriminative training of Boltzmann machine through quantum annealing |
title | Generative and discriminative training of Boltzmann machine through quantum annealing |
title_full | Generative and discriminative training of Boltzmann machine through quantum annealing |
title_fullStr | Generative and discriminative training of Boltzmann machine through quantum annealing |
title_full_unstemmed | Generative and discriminative training of Boltzmann machine through quantum annealing |
title_short | Generative and discriminative training of Boltzmann machine through quantum annealing |
title_sort | generative and discriminative training of boltzmann machine through quantum annealing |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10188519/ https://www.ncbi.nlm.nih.gov/pubmed/37193710 http://dx.doi.org/10.1038/s41598-023-34652-4 |
work_keys_str_mv | AT srivastavasiddhartha generativeanddiscriminativetrainingofboltzmannmachinethroughquantumannealing AT sundararaghavanveera generativeanddiscriminativetrainingofboltzmannmachinethroughquantumannealing |