Bayesian Reasoning with Trained Neural Networks

We showed how to use trained neural networks to perform Bayesian reasoning in order to solve tasks outside their initial scope. Deep generative models provide prior knowledge, and classification/regression networks impose constraints. The tasks at hand were formulated as Bayesian inference problems, which we approximately solved through variational or sampling techniques. The approach built on top of already trained networks, and the addressable questions grew super-exponentially with the number of available networks. In its simplest form, the approach yielded conditional generative models. However, multiple simultaneous constraints constitute elaborate questions. We compared the approach to specifically trained generators, showed how to solve riddles, and demonstrated its compatibility with state-of-the-art architectures.
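A minimal sketch of the kind of inference problem the abstract describes may help. The parametrization below is illustrative, not notation taken from this record: assume a trained generative network G that maps latent variables ξ with a standard normal prior to data x = G(ξ), and a trained classification or regression network f whose output defines the likelihood of an imposed constraint d.

% Sketch only: G, f, xi, d_i are illustrative symbols, not the paper's notation.
\[
  P(\xi \mid d) \;\propto\; P\big(d \mid f(G(\xi))\big)\,\mathcal{N}(\xi \mid 0, I),
  \qquad
  P(\xi \mid d_1,\dots,d_n) \;\propto\; \mathcal{N}(\xi \mid 0, I)\,\prod_{i=1}^{n} P\big(d_i \mid f_i(G(\xi))\big).
\]

Samples from, or a variational approximation of, such a posterior pushed through G would act as the conditional generative model mentioned in the abstract; each additional trained network f_i multiplies in another likelihood factor, which is one way to read the claim that the set of addressable questions grows super-exponentially with the number of available networks.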

Bibliographic Details
Main Authors: Knollmüller, Jakob; Enßlin, Torsten A.
Format: Online Article Text
Language: English
Published: MDPI, 2021
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8229912/
https://www.ncbi.nlm.nih.gov/pubmed/34073066
http://dx.doi.org/10.3390/e23060693
Journal: Entropy (Basel)
Institution: National Center for Biotechnology Information
Collection: PubMed (record ID pubmed-8229912)
Published Online: 2021-05-31
License: © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).