
A Neural Network MCMC Sampler That Maximizes Proposal Entropy

Markov Chain Monte Carlo (MCMC) methods sample from unnormalized probability distributions and offer guarantees of exact sampling. However, in the continuous case, unfavorable geometry of the target distribution can greatly limit the efficiency of MCMC methods. Augmenting samplers with neural networks can potentially improve their efficiency. Previous neural network-based samplers were trained with objectives that either did not explicitly encourage exploration, or contained a term that encouraged exploration but only for well-structured distributions. Here we propose to maximize proposal entropy for adapting the proposal to distributions of any shape. To optimize proposal entropy directly, we devised a neural network MCMC sampler that has a flexible and tractable proposal distribution. Specifically, our network architecture utilizes the gradient of the target distribution for generating proposals. Our model achieved significantly higher efficiency than previous neural network MCMC techniques in a variety of sampling tasks, sometimes by more than an order of magnitude. Further, the sampler was demonstrated through the training of a convergent energy-based model of natural images. The adaptive sampler achieved unbiased sampling with significantly higher proposal entropy than a Langevin dynamics sampler. The trained sampler also achieved better sample quality.
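The abstract contrasts the learned sampler with a Langevin dynamics baseline, whose proposal also uses the gradient of the target. A minimal sketch of such a gradient-based proposal with a Metropolis-Hastings correction (MALA) is below; the target density, step size, and function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def log_p(x):
    # Unnormalized log-density of an anisotropic 2-D Gaussian target
    # (an assumed toy target, chosen to have unfavorable geometry).
    return -0.5 * (x[0] ** 2 / 1.0 + x[1] ** 2 / 0.01)

def grad_log_p(x):
    return np.array([-x[0] / 1.0, -x[1] / 0.01])

def mala_step(x, eps, rng):
    # Gradient-based proposal: x' = x + eps * grad log p(x) + sqrt(2*eps) * noise.
    mean = x + eps * grad_log_p(x)
    prop = mean + np.sqrt(2 * eps) * rng.standard_normal(x.shape)

    def log_q(y, m):
        # Gaussian proposal log-density up to a constant (variance 2*eps*I).
        return -np.sum((y - m) ** 2) / (4 * eps)

    # Metropolis-Hastings correction keeps the chain exact for the target.
    rev_mean = prop + eps * grad_log_p(prop)
    log_alpha = (log_p(prop) + log_q(x, rev_mean)
                 - log_p(x) - log_q(prop, mean))
    accepted = np.log(rng.uniform()) < log_alpha
    return (prop if accepted else x), accepted

rng = np.random.default_rng(0)
x = np.zeros(2)
accepts = 0
for _ in range(1000):
    x, ok = mala_step(x, eps=0.005, rng=rng)
    accepts += ok
print(f"acceptance rate: {accepts / 1000:.2f}")
```

Note that this fixed isotropic proposal has entropy d/2 * log(2*pi*e * 2*eps), so raising eps to explore faster lowers acceptance on ill-conditioned targets; the paper's learned proposal instead maximizes proposal entropy directly while remaining tractable.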


Bibliographic Details
Main Authors: Li, Zengyi, Chen, Yubei, Sommer, Friedrich T.
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7996279/
https://www.ncbi.nlm.nih.gov/pubmed/33668743
http://dx.doi.org/10.3390/e23030269
_version_ 1783670081987805184
author Li, Zengyi
Chen, Yubei
Sommer, Friedrich T.
author_facet Li, Zengyi
Chen, Yubei
Sommer, Friedrich T.
author_sort Li, Zengyi
collection PubMed
description Markov Chain Monte Carlo (MCMC) methods sample from unnormalized probability distributions and offer guarantees of exact sampling. However, in the continuous case, unfavorable geometry of the target distribution can greatly limit the efficiency of MCMC methods. Augmenting samplers with neural networks can potentially improve their efficiency. Previous neural network-based samplers were trained with objectives that either did not explicitly encourage exploration, or contained a term that encouraged exploration but only for well-structured distributions. Here we propose to maximize proposal entropy for adapting the proposal to distributions of any shape. To optimize proposal entropy directly, we devised a neural network MCMC sampler that has a flexible and tractable proposal distribution. Specifically, our network architecture utilizes the gradient of the target distribution for generating proposals. Our model achieved significantly higher efficiency than previous neural network MCMC techniques in a variety of sampling tasks, sometimes by more than an order of magnitude. Further, the sampler was demonstrated through the training of a convergent energy-based model of natural images. The adaptive sampler achieved unbiased sampling with significantly higher proposal entropy than a Langevin dynamics sampler. The trained sampler also achieved better sample quality.
format Online
Article
Text
id pubmed-7996279
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-7996279 2021-03-27 A Neural Network MCMC Sampler That Maximizes Proposal Entropy Li, Zengyi Chen, Yubei Sommer, Friedrich T. Entropy (Basel) Article Markov Chain Monte Carlo (MCMC) methods sample from unnormalized probability distributions and offer guarantees of exact sampling. However, in the continuous case, unfavorable geometry of the target distribution can greatly limit the efficiency of MCMC methods. Augmenting samplers with neural networks can potentially improve their efficiency. Previous neural network-based samplers were trained with objectives that either did not explicitly encourage exploration, or contained a term that encouraged exploration but only for well-structured distributions. Here we propose to maximize proposal entropy for adapting the proposal to distributions of any shape. To optimize proposal entropy directly, we devised a neural network MCMC sampler that has a flexible and tractable proposal distribution. Specifically, our network architecture utilizes the gradient of the target distribution for generating proposals. Our model achieved significantly higher efficiency than previous neural network MCMC techniques in a variety of sampling tasks, sometimes by more than an order of magnitude. Further, the sampler was demonstrated through the training of a convergent energy-based model of natural images. The adaptive sampler achieved unbiased sampling with significantly higher proposal entropy than a Langevin dynamics sampler. The trained sampler also achieved better sample quality. MDPI 2021-02-25 /pmc/articles/PMC7996279/ /pubmed/33668743 http://dx.doi.org/10.3390/e23030269 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Li, Zengyi
Chen, Yubei
Sommer, Friedrich T.
A Neural Network MCMC Sampler That Maximizes Proposal Entropy
title A Neural Network MCMC Sampler That Maximizes Proposal Entropy
title_full A Neural Network MCMC Sampler That Maximizes Proposal Entropy
title_fullStr A Neural Network MCMC Sampler That Maximizes Proposal Entropy
title_full_unstemmed A Neural Network MCMC Sampler That Maximizes Proposal Entropy
title_short A Neural Network MCMC Sampler That Maximizes Proposal Entropy
title_sort neural network mcmc sampler that maximizes proposal entropy
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7996279/
https://www.ncbi.nlm.nih.gov/pubmed/33668743
http://dx.doi.org/10.3390/e23030269
work_keys_str_mv AT lizengyi aneuralnetworkmcmcsamplerthatmaximizesproposalentropy
AT chenyubei aneuralnetworkmcmcsamplerthatmaximizesproposalentropy
AT sommerfriedricht aneuralnetworkmcmcsamplerthatmaximizesproposalentropy
AT lizengyi neuralnetworkmcmcsamplerthatmaximizesproposalentropy
AT chenyubei neuralnetworkmcmcsamplerthatmaximizesproposalentropy
AT sommerfriedricht neuralnetworkmcmcsamplerthatmaximizesproposalentropy