
No Statistical-Computational Gap in Spiked Matrix Models with Generative Network Priors †

We provide a non-asymptotic analysis of the spiked Wishart and Wigner matrix models with a generative neural network prior. Spiked random matrices have the form of a rank-one signal plus noise and have been used as models for high dimensional Principal Component Analysis (PCA), community detection and synchronization over groups. Depending on the prior imposed on the spike, these models can display a statistical-computational gap between the information theoretically optimal reconstruction error that can be achieved with unbounded computational resources and the sub-optimal performances of currently known polynomial time algorithms. These gaps are believed to be fundamental, as in the emblematic case of Sparse PCA. In stark contrast to such cases, we show that there is no statistical-computational gap under a generative network prior, in which the spike lies on the range of a generative neural network. Specifically, we analyze a gradient descent method for minimizing a nonlinear least squares objective over the range of an expansive-Gaussian neural network and show that it can recover in polynomial time an estimate of the underlying spike with a rate-optimal sample complexity and dependence on the noise level.

Bibliographic Details
Main Authors: Cocola, Jorio, Hand, Paul, Voroninski, Vladislav
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7830301/
https://www.ncbi.nlm.nih.gov/pubmed/33467175
http://dx.doi.org/10.3390/e23010115
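The abstract's setup can be illustrated with a toy sketch: a spiked Wigner observation of the form rank-one signal plus symmetric noise, where the spike lies on the range of a small expansive ReLU network, recovered by gradient descent on a nonlinear least squares objective over the latent code. Everything below (the network `G`, dimensions `k`, `n1`, `n`, noise level `nu`, step size, and iteration count) is a hypothetical choice for illustration, not the paper's exact construction or algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-layer expansive ReLU network G(z) = relu(W2 relu(W1 z)).
# Gaussian weights, scaled so that ||G(z)|| stays O(1); dimensions are toy-sized.
k, n1, n = 4, 40, 120
W1 = rng.normal(size=(n1, k)) / np.sqrt(n1)
W2 = rng.normal(size=(n, n1)) / np.sqrt(n)

def G(z):
    return np.maximum(W2 @ np.maximum(W1 @ z, 0.0), 0.0)

# Spiked Wigner observation: rank-one spike G(z*) G(z*)^T plus symmetric noise.
z_star = rng.normal(size=k)
x_star = G(z_star)
nu = 0.05  # noise level
H = rng.normal(size=(n, n))
Y = np.outer(x_star, x_star) + nu * (H + H.T) / np.sqrt(2.0 * n)

# Nonlinear least squares over the latent code z, which enforces the
# generative prior x = G(z):  f(z) = (1/4) ||Y - G(z) G(z)^T||_F^2.
def f(z):
    x = G(z)
    return 0.25 * np.linalg.norm(Y - np.outer(x, x), "fro") ** 2

# Central finite-difference gradient; the paper analyzes exact (sub)gradients,
# numerical differentiation just keeps this sketch short.
def num_grad(z, eps=1e-5):
    g = np.empty_like(z)
    for i in range(z.size):
        e = np.zeros_like(z)
        e[i] = eps
        g[i] = (f(z + e) - f(z - e)) / (2.0 * eps)
    return g

# Plain gradient descent from a small random initialization.
z = 0.1 * rng.normal(size=k)
f_init = f(z)
for _ in range(400):
    z = z - 0.05 * num_grad(z)

rel_err = np.linalg.norm(G(z) - x_star) / np.linalg.norm(x_star)
print(f"objective: {f_init:.4f} -> {f(z):.4f}, relative error: {rel_err:.3f}")
```

In this toy regime the spike eigenvalue dominates the noise operator norm, so descending the latent-space objective typically drives `G(z) G(z)^T` toward the rank-one component of `Y`; the paper's contribution is proving this kind of convergence with rate-optimal sample complexity, which the sketch does not attempt.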
author Cocola, Jorio
Hand, Paul
Voroninski, Vladislav
author_sort Cocola, Jorio
collection PubMed
description We provide a non-asymptotic analysis of the spiked Wishart and Wigner matrix models with a generative neural network prior. Spiked random matrices have the form of a rank-one signal plus noise and have been used as models for high dimensional Principal Component Analysis (PCA), community detection and synchronization over groups. Depending on the prior imposed on the spike, these models can display a statistical-computational gap between the information theoretically optimal reconstruction error that can be achieved with unbounded computational resources and the sub-optimal performances of currently known polynomial time algorithms. These gaps are believed to be fundamental, as in the emblematic case of Sparse PCA. In stark contrast to such cases, we show that there is no statistical-computational gap under a generative network prior, in which the spike lies on the range of a generative neural network. Specifically, we analyze a gradient descent method for minimizing a nonlinear least squares objective over the range of an expansive-Gaussian neural network and show that it can recover in polynomial time an estimate of the underlying spike with a rate-optimal sample complexity and dependence on the noise level.
format Online
Article
Text
id pubmed-7830301
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-7830301 2021-02-24 No Statistical-Computational Gap in Spiked Matrix Models with Generative Network Priors † Cocola, Jorio Hand, Paul Voroninski, Vladislav Entropy (Basel) Article MDPI 2021-01-16 /pmc/articles/PMC7830301/ /pubmed/33467175 http://dx.doi.org/10.3390/e23010115 Text en © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
title No Statistical-Computational Gap in Spiked Matrix Models with Generative Network Priors †
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7830301/
https://www.ncbi.nlm.nih.gov/pubmed/33467175
http://dx.doi.org/10.3390/e23010115