
PriorVAE: encoding spatial priors with variational autoencoders for small-area estimation

Gaussian processes (GPs), implemented through multivariate Gaussian distributions for a finite collection of data, are the most popular approach in small-area spatial statistical modelling. In this context, they are used to encode correlation structures over space and can generalize well in interpolation tasks. Despite their flexibility, off-the-shelf GPs present serious computational challenges which limit their scalability and practical usefulness in applied settings. Here, we propose a novel, deep generative modelling approach to tackle this challenge, termed PriorVAE: for a particular spatial setting, we approximate a class of GP priors through prior sampling and subsequent fitting of a variational autoencoder (VAE). Given a trained VAE, the resultant decoder allows spatial inference to become incredibly efficient due to the low dimensional, independently distributed latent Gaussian space representation of the VAE. Once trained, inference using the VAE decoder replaces the GP within a Bayesian sampling framework. This approach provides tractable and easy-to-implement means of approximately encoding spatial priors and facilitates efficient statistical inference. We demonstrate the utility of our VAE two-stage approach on Bayesian, small-area estimation tasks.


Bibliographic Details
Main Authors: Semenova, Elizaveta, Xu, Yidan, Howes, Adam, Rashid, Theo, Bhatt, Samir, Mishra, Swapnil, Flaxman, Seth
Format: Online Article Text
Language: English
Published: The Royal Society 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9174721/
https://www.ncbi.nlm.nih.gov/pubmed/35673858
http://dx.doi.org/10.1098/rsif.2022.0094
_version_ 1784722301163405312
author Semenova, Elizaveta
Xu, Yidan
Howes, Adam
Rashid, Theo
Bhatt, Samir
Mishra, Swapnil
Flaxman, Seth
author_facet Semenova, Elizaveta
Xu, Yidan
Howes, Adam
Rashid, Theo
Bhatt, Samir
Mishra, Swapnil
Flaxman, Seth
author_sort Semenova, Elizaveta
collection PubMed
description Gaussian processes (GPs), implemented through multivariate Gaussian distributions for a finite collection of data, are the most popular approach in small-area spatial statistical modelling. In this context, they are used to encode correlation structures over space and can generalize well in interpolation tasks. Despite their flexibility, off-the-shelf GPs present serious computational challenges which limit their scalability and practical usefulness in applied settings. Here, we propose a novel, deep generative modelling approach to tackle this challenge, termed PriorVAE: for a particular spatial setting, we approximate a class of GP priors through prior sampling and subsequent fitting of a variational autoencoder (VAE). Given a trained VAE, the resultant decoder allows spatial inference to become incredibly efficient due to the low dimensional, independently distributed latent Gaussian space representation of the VAE. Once trained, inference using the VAE decoder replaces the GP within a Bayesian sampling framework. This approach provides tractable and easy-to-implement means of approximately encoding spatial priors and facilitates efficient statistical inference. We demonstrate the utility of our VAE two-stage approach on Bayesian, small-area estimation tasks.
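A minimal sketch of stage one of the two-stage approach described above: drawing GP prior realisations to serve as VAE training data. The 1-D grid, RBF (squared-exponential) kernel, and sample count here are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def rbf_kernel(x, lengthscale=0.2, variance=1.0, jitter=1e-6):
    # Squared-exponential covariance over a set of 1-D locations x.
    # A small jitter term keeps the matrix numerically positive definite.
    d2 = (x[:, None] - x[None, :]) ** 2
    K = variance * np.exp(-0.5 * d2 / lengthscale**2)
    return K + jitter * np.eye(len(x))

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)   # 50 spatial locations (illustrative grid)
K = rbf_kernel(x)
L = np.linalg.cholesky(K)

# Draw GP prior realisations f ~ N(0, K) via the Cholesky factor.
# These draws form the training set for the VAE; at inference time,
# the trained decoder stands in for the GP inside the Bayesian sampler.
n_samples = 1000
train = rng.standard_normal((n_samples, len(x))) @ L.T

print(train.shape)  # (1000, 50)
```

Because the expensive kernel computations happen only in this offline sampling stage, the downstream MCMC step needs only the low-dimensional latent Gaussian variables and a decoder forward pass.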
format Online
Article
Text
id pubmed-9174721
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher The Royal Society
record_format MEDLINE/PubMed
spelling pubmed-91747212022-06-08 PriorVAE: encoding spatial priors with variational autoencoders for small-area estimation Semenova, Elizaveta Xu, Yidan Howes, Adam Rashid, Theo Bhatt, Samir Mishra, Swapnil Flaxman, Seth J R Soc Interface Life Sciences–Mathematics interface Gaussian processes (GPs), implemented through multivariate Gaussian distributions for a finite collection of data, are the most popular approach in small-area spatial statistical modelling. In this context, they are used to encode correlation structures over space and can generalize well in interpolation tasks. Despite their flexibility, off-the-shelf GPs present serious computational challenges which limit their scalability and practical usefulness in applied settings. Here, we propose a novel, deep generative modelling approach to tackle this challenge, termed PriorVAE: for a particular spatial setting, we approximate a class of GP priors through prior sampling and subsequent fitting of a variational autoencoder (VAE). Given a trained VAE, the resultant decoder allows spatial inference to become incredibly efficient due to the low dimensional, independently distributed latent Gaussian space representation of the VAE. Once trained, inference using the VAE decoder replaces the GP within a Bayesian sampling framework. This approach provides tractable and easy-to-implement means of approximately encoding spatial priors and facilitates efficient statistical inference. We demonstrate the utility of our VAE two-stage approach on Bayesian, small-area estimation tasks. The Royal Society 2022-06-08 /pmc/articles/PMC9174721/ /pubmed/35673858 http://dx.doi.org/10.1098/rsif.2022.0094 Text en © 2022 The Authors. 
Published by the Royal Society under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, provided the original author and source are credited.
spellingShingle Life Sciences–Mathematics interface
Semenova, Elizaveta
Xu, Yidan
Howes, Adam
Rashid, Theo
Bhatt, Samir
Mishra, Swapnil
Flaxman, Seth
PriorVAE: encoding spatial priors with variational autoencoders for small-area estimation
title PriorVAE: encoding spatial priors with variational autoencoders for small-area estimation
title_full PriorVAE: encoding spatial priors with variational autoencoders for small-area estimation
title_fullStr PriorVAE: encoding spatial priors with variational autoencoders for small-area estimation
title_full_unstemmed PriorVAE: encoding spatial priors with variational autoencoders for small-area estimation
title_short PriorVAE: encoding spatial priors with variational autoencoders for small-area estimation
title_sort priorvae: encoding spatial priors with variational autoencoders for small-area estimation
topic Life Sciences–Mathematics interface
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9174721/
https://www.ncbi.nlm.nih.gov/pubmed/35673858
http://dx.doi.org/10.1098/rsif.2022.0094
work_keys_str_mv AT semenovaelizaveta priorvaeencodingspatialpriorswithvariationalautoencodersforsmallareaestimation
AT xuyidan priorvaeencodingspatialpriorswithvariationalautoencodersforsmallareaestimation
AT howesadam priorvaeencodingspatialpriorswithvariationalautoencodersforsmallareaestimation
AT rashidtheo priorvaeencodingspatialpriorswithvariationalautoencodersforsmallareaestimation
AT bhattsamir priorvaeencodingspatialpriorswithvariationalautoencodersforsmallareaestimation
AT mishraswapnil priorvaeencodingspatialpriorswithvariationalautoencodersforsmallareaestimation
AT flaxmanseth priorvaeencodingspatialpriorswithvariationalautoencodersforsmallareaestimation