
Variational Inference via Rényi Bound Optimization and Multiple-Source Adaptation †

Bibliographic Details
Main Authors: Zalman (Oshri), Dana; Fine, Shai
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10606691/
https://www.ncbi.nlm.nih.gov/pubmed/37895589
http://dx.doi.org/10.3390/e25101468
collection PubMed
description Variational inference provides a way to approximate probability densities through optimization. It does so by optimizing an upper or a lower bound of the likelihood of the observed data (the evidence). The classic variational inference approach suggests maximizing the Evidence Lower Bound (ELBO). Recent studies proposed to optimize the variational Rényi bound (VR) and the [Formula: see text] upper bound. However, these estimates, which are based on the Monte Carlo (MC) approximation, either underestimate the bound or exhibit a high variance. In this work, we introduce a new upper bound, termed the Variational Rényi Log Upper bound (VRLU), which is based on the existing VR bound. In contrast to the existing VR bound, the MC approximation of the VRLU bound maintains the upper bound property. Furthermore, we devise a (sandwiched) upper–lower bound variational inference method, termed the Variational Rényi Sandwich (VRS), to jointly optimize the upper and lower bounds. We present a set of experiments designed to evaluate the new VRLU bound and to compare the VRS method with the classic Variational Autoencoder (VAE) and the VR methods.

Next, we apply the VRS approximation to the Multiple-Source Adaptation (MSA) problem. MSA is a real-world scenario where data are collected from multiple sources that differ from one another by their probability distribution over the input space. The main aim is to combine fairly accurate predictive models from these sources and create an accurate model for new, mixed target domains. However, many domain adaptation methods assume prior knowledge of the data distribution in the source domains. Applying the suggested VRS density estimate to the MSA problem, we show, both theoretically and empirically, that it provides tighter error bounds and improved performance compared to leading MSA methods.
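The VR bound mentioned in the abstract generalizes the ELBO through Rényi's α-divergences; in its commonly cited formulation, the K-sample Monte Carlo estimate is L̂_α = (1/(1−α)) log((1/K) Σ_k w_k^(1−α)) with importance weights w_k = p(x, z_k)/q(z_k|x). As a minimal illustrative sketch (not the authors' implementation; the function name and interface are assumptions), this estimate can be computed from log-weights as follows:

```python
import numpy as np
from scipy.special import logsumexp

def vr_bound_mc(log_w, alpha):
    """Monte Carlo estimate of the variational Renyi (VR) bound (sketch).

    log_w : array of K log importance weights,
            log p(x, z_k) - log q(z_k | x), with z_k ~ q(z | x).
    alpha : Renyi order; alpha -> 1 recovers the ELBO.
    """
    log_w = np.asarray(log_w, dtype=float)
    k = log_w.shape[0]
    if np.isclose(alpha, 1.0):
        # Limit alpha -> 1: the ELBO, E_q[log p(x,z) - log q(z|x)].
        return log_w.mean()
    # (1/(1-alpha)) * log((1/K) * sum_k exp((1-alpha) * log_w_k)),
    # computed via logsumexp for numerical stability.
    return (logsumexp((1.0 - alpha) * log_w) - np.log(k)) / (1.0 - alpha)
```

Since the VR bound is non-increasing in α, the α = 0 estimate (an importance-weighted average) is at least the ELBO computed on the same samples. For α < 1 the MC estimate can underestimate the true bound, which is the gap the VRLU upper bound described in the abstract is designed to avoid.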
id pubmed-10606691
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling pubmed-10606691 2023-10-28 Entropy (Basel) Article MDPI 2023-10-20 /pmc/articles/PMC10606691/ /pubmed/37895589 http://dx.doi.org/10.3390/e25101468 Text en © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).