Evolutionary Variational Optimization of Generative Models
We combine two popular optimization approaches to derive learning algorithms for generative models: variational optimization and evolutionary algorithms. The combination is realized for generative models with discrete latents by using truncated posteriors as the family of variational distributions. …
Main authors: | Drefs, Jakob; Guiraud, Enrico; Lücke, Jörg; Wood, Frank |
Language: | eng |
Published: | 2022 |
Subjects: | Computing and Computers |
Online access: | http://cds.cern.ch/record/2851293 |
_version_ | 1780977110801711104 |
author | Drefs, Jakob; Guiraud, Enrico; Lücke, Jörg; Wood, Frank |
author_facet | Drefs, Jakob; Guiraud, Enrico; Lücke, Jörg; Wood, Frank |
author_sort | Drefs, Jakob |
collection | CERN |
description | We combine two popular optimization approaches to derive learning algorithms for generative models: variational optimization and evolutionary algorithms. The combination is realized for generative models with discrete latents by using truncated posteriors as the family of variational distributions. The variational parameters of truncated posteriors are sets of latent states. By interpreting these states as genomes of individuals and by using the variational lower bound to define a fitness, we can apply evolutionary algorithms to realize the variational loop. The variational distributions used are very flexible, and we show that evolutionary algorithms can effectively and efficiently optimize the variational bound. Furthermore, the variational loop is generally applicable (“black box”), with no analytical derivations required. To show general applicability, we apply the approach to three generative models (Noisy-OR Bayes Nets, Binary Sparse Coding, and Spike-and-Slab Sparse Coding). To demonstrate the effectiveness and efficiency of the novel variational approach, we use the standard competitive benchmarks of image denoising and inpainting. The benchmarks allow quantitative comparisons to a wide range of methods, including probabilistic approaches, deep deterministic and generative networks, and non-local image processing methods. In the category of “zero-shot” learning (when only the corrupted image is used for training), we observed the evolutionary variational algorithm to significantly improve the state of the art in many benchmark settings. For one well-known inpainting benchmark, we also observed state-of-the-art performance across all categories of algorithms, even though we train only on the corrupted image. In general, our investigations highlight the importance of research on optimization methods for generative models for achieving performance improvements. |
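The loop the abstract describes can be made concrete with a short sketch. The snippet below is a minimal, hypothetical illustration, not the paper's implementation: each datapoint's set K of latent states plays the role of a population of genomes, the fitness of a state is its log-joint log p(y, s | Θ), and survivor selection keeps the fittest distinct states. For truncated posteriors, the variational lower bound can be written as F(K, Θ) = Σ_n log Σ_{s ∈ K_n} p(s, y⁽ⁿ⁾ | Θ), so replacing states in K_n with distinct higher-joint states can only tighten the bound. The linear-Gaussian toy model with sparse binary latents and all names here (log_joint, mutate, evolve_states) are assumptions for illustration.

```python
import numpy as np

# Hypothetical sketch of the evolutionary variational loop from the abstract.
# The linear-Gaussian toy model and all names are illustrative assumptions,
# not the paper's actual code or API.

def log_joint(y, s, W, sigma=0.1, pi=0.1):
    """log p(y, s): Gaussian likelihood with mean W @ s, Bernoulli(pi) prior on s."""
    log_lik = -0.5 * np.sum((y - W @ s) ** 2) / sigma ** 2
    log_prior = np.sum(np.where(s == 1, np.log(pi), np.log(1.0 - pi)))
    return log_lik + log_prior

def mutate(s, rng, p_flip=0.1):
    """Produce a child genome by flipping each bit with small probability."""
    return np.where(rng.random(s.shape) < p_flip, 1 - s, s)

def evolve_states(y, K, W, rng, n_gen=20, n_children=3):
    """Evolve the set K of latent states (the variational parameters).

    Fitness of a state is its log-joint with the datapoint y; keeping the
    fittest distinct states can only increase the truncated bound.
    """
    size = len(K)
    for _ in range(n_gen):
        children = [mutate(s, rng) for s in K for _ in range(n_children)]
        # A truncated posterior is defined over *distinct* states, so deduplicate.
        pool = {tuple(s) for s in K} | {tuple(c) for c in children}
        ranked = sorted(pool, key=lambda t: log_joint(y, np.array(t), W), reverse=True)
        K = [np.array(t) for t in ranked[:size]]
    return K

# Toy usage: D=10 observed dims, H=8 binary latents, |K|=5 states.
rng = np.random.default_rng(0)
D, H = 10, 8
W = rng.normal(size=(D, H))
s_true = (rng.random(H) < 0.1).astype(int)
y = W @ s_true + 0.1 * rng.normal(size=D)
K = [(rng.random(H) < 0.5).astype(int) for _ in range(5)]
K = evolve_states(y, K, W, rng)
```

In a full EM-style run one would alternate this state evolution (the E-step-like phase) with updates of the model parameters Θ given the evolved states; the sketch covers only the state-evolution half and uses bit-flip mutation only, whereas practical evolutionary algorithms may also employ crossover and other selection schemes.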
id | cern-2851293 |
institution | European Organization for Nuclear Research |
language | eng |
publishDate | 2022 |
record_format | invenio |
spelling | cern-2851293 2023-08-09T10:54:09Z http://cds.cern.ch/record/2851293 eng Drefs, Jakob; Guiraud, Enrico; Lücke, Jörg; Wood, Frank Evolutionary Variational Optimization of Generative Models Computing and Computers We combine two popular optimization approaches to derive learning algorithms for generative models: variational optimization and evolutionary algorithms. The combination is realized for generative models with discrete latents by using truncated posteriors as the family of variational distributions. The variational parameters of truncated posteriors are sets of latent states. By interpreting these states as genomes of individuals and by using the variational lower bound to define a fitness, we can apply evolutionary algorithms to realize the variational loop. The variational distributions used are very flexible, and we show that evolutionary algorithms can effectively and efficiently optimize the variational bound. Furthermore, the variational loop is generally applicable (“black box”), with no analytical derivations required. To show general applicability, we apply the approach to three generative models (Noisy-OR Bayes Nets, Binary Sparse Coding, and Spike-and-Slab Sparse Coding). To demonstrate the effectiveness and efficiency of the novel variational approach, we use the standard competitive benchmarks of image denoising and inpainting. The benchmarks allow quantitative comparisons to a wide range of methods, including probabilistic approaches, deep deterministic and generative networks, and non-local image processing methods. In the category of “zero-shot” learning (when only the corrupted image is used for training), we observed the evolutionary variational algorithm to significantly improve the state of the art in many benchmark settings. For one well-known inpainting benchmark, we also observed state-of-the-art performance across all categories of algorithms, even though we train only on the corrupted image. In general, our investigations highlight the importance of research on optimization methods for generative models for achieving performance improvements. oai:cds.cern.ch:2851293 2022 |
spellingShingle | Computing and Computers; Drefs, Jakob; Guiraud, Enrico; Lücke, Jörg; Wood, Frank; Evolutionary Variational Optimization of Generative Models |
title | Evolutionary Variational Optimization of Generative Models |
title_full | Evolutionary Variational Optimization of Generative Models |
title_fullStr | Evolutionary Variational Optimization of Generative Models |
title_full_unstemmed | Evolutionary Variational Optimization of Generative Models |
title_short | Evolutionary Variational Optimization of Generative Models |
title_sort | evolutionary variational optimization of generative models |
topic | Computing and Computers |
url | http://cds.cern.ch/record/2851293 |
work_keys_str_mv | AT drefsjakob evolutionaryvariationaloptimizationofgenerativemodels AT guiraudenrico evolutionaryvariationaloptimizationofgenerativemodels AT luckejorg evolutionaryvariationaloptimizationofgenerativemodels AT woodfrank evolutionaryvariationaloptimizationofgenerativemodels |