Multi-view Deep Gaussian Process with a Pre-training Acceleration Technique
The deep Gaussian process (DGP) is a popular probabilistic modeling method that is powerful and widely used for function approximation and uncertainty estimation. However, the traditional DGP does not account for multi-view cases, in which data may come from different sources or be constructed from different types of features. In this paper, we propose a generalized multi-view DGP (MvDGP) to capture the characteristics of different views and model the data in each view discriminately. To make the proposed model more efficient to train, we introduce a pre-training network in MvDGP and incorporate stochastic variational inference for fine-tuning. Experimental results on real-world data sets demonstrate that the pre-trained MvDGP outperforms state-of-the-art DGP models and deep neural networks while achieving higher computational efficiency than other DGP models.
Main Authors: | Zhu, Han, Zhao, Jing, Sun, Shiliang |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | 2020 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7206310/ http://dx.doi.org/10.1007/978-3-030-47436-2_23 |
_version_ | 1783530391781507072 |
---|---|
author | Zhu, Han; Zhao, Jing; Sun, Shiliang |
author_facet | Zhu, Han; Zhao, Jing; Sun, Shiliang |
author_sort | Zhu, Han |
collection | PubMed |
description | The deep Gaussian process (DGP) is a popular probabilistic modeling method that is powerful and widely used for function approximation and uncertainty estimation. However, the traditional DGP does not account for multi-view cases, in which data may come from different sources or be constructed from different types of features. In this paper, we propose a generalized multi-view DGP (MvDGP) to capture the characteristics of different views and model the data in each view discriminately. To make the proposed model more efficient to train, we introduce a pre-training network in MvDGP and incorporate stochastic variational inference for fine-tuning. Experimental results on real-world data sets demonstrate that the pre-trained MvDGP outperforms state-of-the-art DGP models and deep neural networks while achieving higher computational efficiency than other DGP models. |
format | Online Article Text |
id | pubmed-7206310 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
record_format | MEDLINE/PubMed |
spelling | pubmed-7206310 2020-05-08 Multi-view Deep Gaussian Process with a Pre-training Acceleration Technique Zhu, Han Zhao, Jing Sun, Shiliang Advances in Knowledge Discovery and Data Mining Article The deep Gaussian process (DGP) is a popular probabilistic modeling method that is powerful and widely used for function approximation and uncertainty estimation. However, the traditional DGP does not account for multi-view cases, in which data may come from different sources or be constructed from different types of features. In this paper, we propose a generalized multi-view DGP (MvDGP) to capture the characteristics of different views and model the data in each view discriminately. To make the proposed model more efficient to train, we introduce a pre-training network in MvDGP and incorporate stochastic variational inference for fine-tuning. Experimental results on real-world data sets demonstrate that the pre-trained MvDGP outperforms state-of-the-art DGP models and deep neural networks while achieving higher computational efficiency than other DGP models. 2020-04-17 /pmc/articles/PMC7206310/ http://dx.doi.org/10.1007/978-3-030-47436-2_23 Text en © Springer Nature Switzerland AG 2020 This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic. |
spellingShingle | Article Zhu, Han Zhao, Jing Sun, Shiliang Multi-view Deep Gaussian Process with a Pre-training Acceleration Technique |
title | Multi-view Deep Gaussian Process with a Pre-training Acceleration Technique |
title_full | Multi-view Deep Gaussian Process with a Pre-training Acceleration Technique |
title_fullStr | Multi-view Deep Gaussian Process with a Pre-training Acceleration Technique |
title_full_unstemmed | Multi-view Deep Gaussian Process with a Pre-training Acceleration Technique |
title_short | Multi-view Deep Gaussian Process with a Pre-training Acceleration Technique |
title_sort | multi-view deep gaussian process with a pre-training acceleration technique |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7206310/ http://dx.doi.org/10.1007/978-3-030-47436-2_23 |
work_keys_str_mv | AT zhuhan multiviewdeepgaussianprocesswithapretrainingaccelerationtechnique AT zhaojing multiviewdeepgaussianprocesswithapretrainingaccelerationtechnique AT sunshiliang multiviewdeepgaussianprocesswithapretrainingaccelerationtechnique |
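As background for the abstract above: a deep Gaussian process stacks Gaussian process (GP) layers, with each layer's output serving as the next layer's input. The following NumPy sketch shows exact single-layer GP regression, the basic building block such models compose. It is purely illustrative and not code from the paper; the RBF kernel, hyperparameter values, and toy data are all assumptions.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=1.0, variance=1.0):
    # Squared-exponential (RBF) kernel matrix between two input sets.
    sq = np.sum(X1**2, 1)[:, None] + np.sum(X2**2, 1)[None, :] - 2 * X1 @ X2.T
    return variance * np.exp(-0.5 * sq / lengthscale**2)

def gp_posterior(X, y, X_star, noise=1e-2):
    # Exact GP regression: posterior mean and variance at test inputs X_star,
    # computed via a Cholesky factorization of the noisy training covariance.
    K = rbf_kernel(X, X) + noise * np.eye(len(X))
    K_s = rbf_kernel(X, X_star)
    K_ss = rbf_kernel(X_star, X_star)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = K_s.T @ alpha
    v = np.linalg.solve(L, K_s)
    var = np.diag(K_ss - v.T @ v)
    return mean, var

# Toy 1-D regression problem: noisy samples of sin(x).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, (20, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(20)
X_star = np.linspace(-3, 3, 50)[:, None]
mean, var = gp_posterior(X, y, X_star)
```

The exact posterior above costs O(n³) in the number of training points; the stochastic variational inference the abstract mentions avoids this by optimizing a small set of inducing points on mini-batches, which is what makes deep and multi-view extensions tractable.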