Using deep LSD to build operators in GANs latent space with meaning in real space
Generative models rely on the idea that data can be represented in terms of latent variables which are uncorrelated by definition. Lack of correlation among the latent variable support is important because it suggests that the latent-space manifold is simpler to understand and manipulate than the real-space representation. Many types of generative model are used in deep learning, e.g., variational autoencoders (VAEs) and generative adversarial networks (GANs). Based on the idea that the latent space behaves like a vector space Radford et al. (2015), we ask whether we can expand the latent space representation of our data elements in terms of an orthonormal basis set. Here we propose a method to build a set of linearly independent vectors in the latent space of a trained GAN, which we call quasi-eigenvectors. These quasi-eigenvectors have two key properties: i) They span the latent space, ii) A set of these quasi-eigenvectors map to each of the labeled features one-to-one. We show that in the case of the MNIST image data set, while the number of dimensions in latent space is large by design, 98% of the data in real space map to a sub-domain of latent space of dimensionality equal to the number of labels. We then show how the quasi-eigenvectors can be used for Latent Spectral Decomposition (LSD). We apply LSD to denoise MNIST images. Finally, using the quasi-eigenvectors, we construct rotation matrices in latent space which map to feature transformations in real space. Overall, from quasi-eigenvectors we gain insight regarding the latent space topology.
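The abstract sketches an algorithmic pipeline: build label-aligned, linearly independent directions (quasi-eigenvectors) in a trained GAN's latent space, expand latent codes in that basis (Latent Spectral Decomposition, LSD), and use the expansion for denoising. The record does not include the authors' construction, so the following is only a minimal sketch under stated assumptions: quasi-eigenvectors are approximated here by per-label mean latent codes followed by orthonormalization, and all names, the stand-in data, and the choice of QR for orthonormalization are hypothetical rather than the paper's implementation.

```python
# Hypothetical sketch only: label-aligned "quasi-eigenvector" directions in a GAN
# latent space, plus a projection/reconstruction step standing in for the Latent
# Spectral Decomposition (LSD) described in the abstract.
import numpy as np

rng = np.random.default_rng(0)

latent_dim = 100   # latent-space dimensionality (large by design, per the abstract)
n_labels = 10      # MNIST digit classes

# Assumed inputs: latent codes z_i and the label of the image the generator G(z_i)
# produces. Stand-in random data is used here purely to keep the sketch runnable.
z = rng.normal(size=(5000, latent_dim))
labels = rng.integers(0, n_labels, size=5000)

# One plausible construction (not necessarily the paper's): one mean latent
# direction per label, then orthonormalization (reduced QR, i.e. Gram-Schmidt).
class_means = np.stack([z[labels == k].mean(axis=0) for k in range(n_labels)])
quasi_eigvecs, _ = np.linalg.qr(class_means.T)   # shape (latent_dim, n_labels)

def lsd_coefficients(z_vec, basis):
    """Spectral weights: coefficients of z_vec along each quasi-eigenvector."""
    return basis.T @ z_vec

def lsd_reconstruct(coeffs, basis):
    """Rebuild a latent code from its spectral weights; zeroing small weights
    before this step is one way latent-space denoising could work."""
    return basis @ coeffs

coeffs = lsd_coefficients(z[0], quasi_eigvecs)
z_clean = lsd_reconstruct(coeffs, quasi_eigvecs)  # would then be passed to the generator
```

In this sketch, the abstract's observation that roughly 98% of the data maps to a latent sub-domain of dimensionality equal to the number of labels would correspond to most of a code's norm being captured by the `n_labels` projection coefficients.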
Main Authors: | Toledo-Marín, J. Quetzalcóatl, Glazier, James A. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Public Library of Science, 2023 |
Subjects: | Research Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10309997/ https://www.ncbi.nlm.nih.gov/pubmed/37384721 http://dx.doi.org/10.1371/journal.pone.0287736 |
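The abstract also mentions rotation matrices in latent space that map to feature transformations in real space. The sketch below, again hypothetical rather than the authors' construction, builds one such operator: a rotation confined to the plane spanned by two quasi-eigenvectors (reusing the assumed `quasi_eigvecs` basis from the sketch above), which leaves all other latent directions unchanged.

```python
import numpy as np

def latent_rotation(basis, i, j, theta):
    """Rotation by angle theta in the plane spanned by quasi-eigenvectors i and j.

    basis: (latent_dim, n_labels) array with orthonormal columns, e.g. the
    hypothetical `quasi_eigvecs` from the earlier sketch. The returned
    (latent_dim, latent_dim) matrix leaves directions outside the (i, j) plane
    untouched; applied to a latent code and pushed through the generator, this
    is the kind of latent-space operator the abstract associates with feature
    transformations in real space.
    """
    u, v = basis[:, i], basis[:, j]
    rot = np.eye(basis.shape[0])
    rot += (np.cos(theta) - 1.0) * (np.outer(u, u) + np.outer(v, v))
    rot += np.sin(theta) * (np.outer(v, u) - np.outer(u, v))
    return rot

# Usage (hypothetical): nudge a latent code from one label direction toward another.
# z_rotated = latent_rotation(quasi_eigvecs, i=3, j=7, theta=np.pi / 6) @ z_code
```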
_version_ | 1785066492690169856 |
---|---|
author | Toledo-Marín, J. Quetzalcóatl; Glazier, James A.
author_facet | Toledo-Marín, J. Quetzalcóatl; Glazier, James A.
author_sort | Toledo-Marín, J. Quetzalcóatl |
collection | PubMed |
description | Generative models rely on the idea that data can be represented in terms of latent variables which are uncorrelated by definition. Lack of correlation among the latent variable support is important because it suggests that the latent-space manifold is simpler to understand and manipulate than the real-space representation. Many types of generative model are used in deep learning, e.g., variational autoencoders (VAEs) and generative adversarial networks (GANs). Based on the idea that the latent space behaves like a vector space Radford et al. (2015), we ask whether we can expand the latent space representation of our data elements in terms of an orthonormal basis set. Here we propose a method to build a set of linearly independent vectors in the latent space of a trained GAN, which we call quasi-eigenvectors. These quasi-eigenvectors have two key properties: i) They span the latent space, ii) A set of these quasi-eigenvectors map to each of the labeled features one-to-one. We show that in the case of the MNIST image data set, while the number of dimensions in latent space is large by design, 98% of the data in real space map to a sub-domain of latent space of dimensionality equal to the number of labels. We then show how the quasi-eigenvectors can be used for Latent Spectral Decomposition (LSD). We apply LSD to denoise MNIST images. Finally, using the quasi-eigenvectors, we construct rotation matrices in latent space which map to feature transformations in real space. Overall, from quasi-eigenvectors we gain insight regarding the latent space topology. |
format | Online Article Text |
id | pubmed-10309997 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-10309997 2023-06-30 Using deep LSD to build operators in GANs latent space with meaning in real space Toledo-Marín, J. Quetzalcóatl Glazier, James A. PLoS One Research Article Generative models rely on the idea that data can be represented in terms of latent variables which are uncorrelated by definition. Lack of correlation among the latent variable support is important because it suggests that the latent-space manifold is simpler to understand and manipulate than the real-space representation. Many types of generative model are used in deep learning, e.g., variational autoencoders (VAEs) and generative adversarial networks (GANs). Based on the idea that the latent space behaves like a vector space Radford et al. (2015), we ask whether we can expand the latent space representation of our data elements in terms of an orthonormal basis set. Here we propose a method to build a set of linearly independent vectors in the latent space of a trained GAN, which we call quasi-eigenvectors. These quasi-eigenvectors have two key properties: i) They span the latent space, ii) A set of these quasi-eigenvectors map to each of the labeled features one-to-one. We show that in the case of the MNIST image data set, while the number of dimensions in latent space is large by design, 98% of the data in real space map to a sub-domain of latent space of dimensionality equal to the number of labels. We then show how the quasi-eigenvectors can be used for Latent Spectral Decomposition (LSD). We apply LSD to denoise MNIST images. Finally, using the quasi-eigenvectors, we construct rotation matrices in latent space which map to feature transformations in real space. Overall, from quasi-eigenvectors we gain insight regarding the latent space topology. Public Library of Science 2023-06-29 /pmc/articles/PMC10309997/ /pubmed/37384721 http://dx.doi.org/10.1371/journal.pone.0287736 Text en © 2023 Toledo-Marín, Glazier https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited. |
spellingShingle | Research Article Toledo-Marín, J. Quetzalcóatl Glazier, James A. Using deep LSD to build operators in GANs latent space with meaning in real space |
title | Using deep LSD to build operators in GANs latent space with meaning in real space |
title_full | Using deep LSD to build operators in GANs latent space with meaning in real space |
title_fullStr | Using deep LSD to build operators in GANs latent space with meaning in real space |
title_full_unstemmed | Using deep LSD to build operators in GANs latent space with meaning in real space |
title_short | Using deep LSD to build operators in GANs latent space with meaning in real space |
title_sort | using deep lsd to build operators in gans latent space with meaning in real space |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10309997/ https://www.ncbi.nlm.nih.gov/pubmed/37384721 http://dx.doi.org/10.1371/journal.pone.0287736 |
work_keys_str_mv | AT toledomarinjquetzalcoatl usingdeeplsdtobuildoperatorsinganslatentspacewithmeaninginrealspace AT glazierjamesa usingdeeplsdtobuildoperatorsinganslatentspacewithmeaninginrealspace |