Dynamical Sampling with Langevin Normalization Flows

Bibliographic Details
Main Authors: Gu, Minghao, Sun, Shiliang, Liu, Yan
Format: Online Article Text
Language: English
Published: MDPI 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7514440/
http://dx.doi.org/10.3390/e21111096
_version_ 1783586588655091712
author Gu, Minghao
Sun, Shiliang
Liu, Yan
author_facet Gu, Minghao
Sun, Shiliang
Liu, Yan
author_sort Gu, Minghao
collection PubMed
description In Bayesian machine learning, sampling methods provide asymptotically unbiased estimates for inference over complex probability distributions, and Markov chain Monte Carlo (MCMC) is one of the most popular sampling methods. However, MCMC can suffer from high autocorrelation of samples or poor performance on some complex distributions. In this paper, we introduce Langevin diffusions into normalization flows to construct a new dynamical sampling method. We propose the modified Kullback-Leibler divergence as the loss function to train the sampler, which ensures that the samples generated by the proposed method converge to the target distribution. Since the gradient function of the target distribution is used when calculating the modified Kullback-Leibler divergence, the integral of the modified Kullback-Leibler divergence is intractable; we therefore use a Monte Carlo estimator to approximate this integral. We also discuss the case in which the target distribution is unnormalized. We illustrate the properties and performance of the proposed method on a variety of complex distributions and real datasets. The experiments indicate that the proposed method not only takes advantage of the flexibility of neural networks but also exploits the rapid convergence of the dynamical system to the target distribution, and it demonstrates performance superior to dynamics-based MCMC samplers.
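The abstract describes a sampler built from Langevin diffusions, trained with a Monte Carlo approximation of an intractable integral. The paper's actual method couples these diffusions with normalization flows; as a minimal, illustrative sketch only (not the authors' algorithm), the following shows the underlying unadjusted Langevin discretization for a toy 2-D Gaussian target, followed by a Monte Carlo estimate of an expectation under the target:

```python
import numpy as np

# Hypothetical toy target: standard 2-D Gaussian, log p(x) = -||x||^2/2 + const,
# so the score (gradient of the log-density) is simply -x.
def grad_log_p(x):
    return -x

def langevin_step(x, step_size, rng):
    """One unadjusted Langevin step:
    x' = x + (step_size/2) * grad log p(x) + sqrt(step_size) * noise."""
    noise = rng.standard_normal(x.shape)
    return x + 0.5 * step_size * grad_log_p(x) + np.sqrt(step_size) * noise

rng = np.random.default_rng(0)
x = rng.standard_normal(2) * 5.0      # start far from the mode
samples = []
for _ in range(5000):
    x = langevin_step(x, step_size=0.1, rng=rng)
    samples.append(x.copy())
samples = np.array(samples[1000:])    # discard burn-in

# Monte Carlo estimator of E[||x||^2] under the target;
# for a 2-D standard Gaussian this expectation is 2 (up to discretization bias).
est = np.mean(np.sum(samples**2, axis=1))
print(est)
```

The discretization step size trades off bias against mixing speed, which is part of why the paper trains a flow-based sampler instead of running the raw diffusion.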
format Online
Article
Text
id pubmed-7514440
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-75144402020-11-09 Dynamical Sampling with Langevin Normalization Flows Gu, Minghao Sun, Shiliang Liu, Yan Entropy (Basel) Article In Bayesian machine learning, sampling methods provide asymptotically unbiased estimates for inference over complex probability distributions, and Markov chain Monte Carlo (MCMC) is one of the most popular sampling methods. However, MCMC can suffer from high autocorrelation of samples or poor performance on some complex distributions. In this paper, we introduce Langevin diffusions into normalization flows to construct a new dynamical sampling method. We propose the modified Kullback-Leibler divergence as the loss function to train the sampler, which ensures that the samples generated by the proposed method converge to the target distribution. Since the gradient function of the target distribution is used when calculating the modified Kullback-Leibler divergence, the integral of the modified Kullback-Leibler divergence is intractable; we therefore use a Monte Carlo estimator to approximate this integral. We also discuss the case in which the target distribution is unnormalized. We illustrate the properties and performance of the proposed method on a variety of complex distributions and real datasets. The experiments indicate that the proposed method not only takes advantage of the flexibility of neural networks but also exploits the rapid convergence of the dynamical system to the target distribution, and it demonstrates performance superior to dynamics-based MCMC samplers. MDPI 2019-11-10 /pmc/articles/PMC7514440/ http://dx.doi.org/10.3390/e21111096 Text en © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Gu, Minghao
Sun, Shiliang
Liu, Yan
Dynamical Sampling with Langevin Normalization Flows
title Dynamical Sampling with Langevin Normalization Flows
title_full Dynamical Sampling with Langevin Normalization Flows
title_fullStr Dynamical Sampling with Langevin Normalization Flows
title_full_unstemmed Dynamical Sampling with Langevin Normalization Flows
title_short Dynamical Sampling with Langevin Normalization Flows
title_sort dynamical sampling with langevin normalization flows
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7514440/
http://dx.doi.org/10.3390/e21111096
work_keys_str_mv AT guminghao dynamicalsamplingwithlangevinnormalizationflows
AT sunshiliang dynamicalsamplingwithlangevinnormalizationflows
AT liuyan dynamicalsamplingwithlangevinnormalizationflows