
Variational Hybrid Monte Carlo for Efficient Multi-Modal Data Sampling


Bibliographic Details
Main Authors: Sun, Shiliang, Zhao, Jing, Gu, Minghao, Wang, Shanhu
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10138141/
https://www.ncbi.nlm.nih.gov/pubmed/37190347
http://dx.doi.org/10.3390/e25040560
author Sun, Shiliang
Zhao, Jing
Gu, Minghao
Wang, Shanhu
collection PubMed
description The Hamiltonian Monte Carlo (HMC) sampling algorithm exploits Hamiltonian dynamics to construct efficient Markov chain Monte Carlo (MCMC) proposals, and it has become increasingly popular in machine learning and statistics. Because HMC uses the gradient information of the target distribution, it can explore the state space much more efficiently than random-walk proposals, but it may suffer from high autocorrelation. In this paper, we propose Langevin Hamiltonian Monte Carlo (LHMC) to reduce the autocorrelation of the samples. Probabilistic inference involving multi-modal distributions is very difficult for dynamics-based MCMC samplers, which are easily trapped in one mode far away from the other modes. To tackle this issue, we further propose variational hybrid Monte Carlo (VHMC), which uses a variational distribution to explore the phase space and find new modes, and is capable of sampling from multi-modal distributions effectively. A formal proof shows that the proposed method converges to the target distribution. Both synthetic and real datasets are used to evaluate its properties and performance. The experimental results verify the theory and show superior performance in multi-modal sampling.
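The description above outlines how HMC uses the gradient of the target log-density inside simulated Hamiltonian dynamics, with a Metropolis correction for discretization error. For orientation only, below is a minimal sketch of a plain HMC transition (leapfrog integrator plus accept/reject); it is not the paper's LHMC or VHMC algorithm, and the step size, number of leapfrog steps, and the toy Gaussian target are assumptions made for illustration.

```python
import numpy as np

def hmc_step(x, log_prob, grad_log_prob, step_size=0.1, n_leapfrog=20, rng=None):
    """One standard HMC transition: draw a momentum, simulate Hamiltonian
    dynamics with the leapfrog integrator, then accept/reject with a
    Metropolis test. Plain HMC sketch, not the paper's LHMC/VHMC method."""
    rng = np.random.default_rng() if rng is None else rng
    p = rng.standard_normal(x.shape)            # momentum ~ N(0, I)
    x_new, p_new = x.copy(), p.copy()

    # Leapfrog integration of H(x, p) = -log_prob(x) + 0.5 * p^T p
    p_new += 0.5 * step_size * grad_log_prob(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p_new
        p_new += step_size * grad_log_prob(x_new)
    x_new += step_size * p_new
    p_new += 0.5 * step_size * grad_log_prob(x_new)

    # Metropolis correction based on the change in total energy
    current_h = -log_prob(x) + 0.5 * np.dot(p, p)
    proposed_h = -log_prob(x_new) + 0.5 * np.dot(p_new, p_new)
    if np.log(rng.uniform()) < current_h - proposed_h:
        return x_new, True
    return x, False

# Illustrative usage: sample a 2D standard Gaussian (grad of log-density is -x).
if __name__ == "__main__":
    log_prob = lambda x: -0.5 * np.dot(x, x)
    grad_log_prob = lambda x: -x
    x, samples = np.zeros(2), []
    for _ in range(1000):
        x, _ = hmc_step(x, log_prob, grad_log_prob)
        samples.append(x)
    print(np.mean(samples, axis=0), np.std(samples, axis=0))
```

Per the description, LHMC and VHMC modify this basic scheme (reducing sample autocorrelation and using a variational distribution to discover distant modes, respectively); the details of those modifications are in the paper itself.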
format Online
Article
Text
id pubmed-10138141
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-10138141 2023-04-28 Variational Hybrid Monte Carlo for Efficient Multi-Modal Data Sampling. Sun, Shiliang; Zhao, Jing; Gu, Minghao; Wang, Shanhu. Entropy (Basel), Article. MDPI 2023-03-24. /pmc/articles/PMC10138141/ /pubmed/37190347 http://dx.doi.org/10.3390/e25040560 Text en © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title Variational Hybrid Monte Carlo for Efficient Multi-Modal Data Sampling
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10138141/
https://www.ncbi.nlm.nih.gov/pubmed/37190347
http://dx.doi.org/10.3390/e25040560