
Variational Information Bottleneck for Unsupervised Clustering: Deep Gaussian Mixture Embedding

Bibliographic Details
Main Authors: Uğur, Yiğit; Arvanitakis, George; Zaidi, Abdellatif
Format: Online Article Text
Language: English
Published: MDPI 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7516645/
https://www.ncbi.nlm.nih.gov/pubmed/33285988
http://dx.doi.org/10.3390/e22020213
Description
Summary: In this paper, we develop an unsupervised generative clustering framework that combines the variational information bottleneck and the Gaussian mixture model. Specifically, in our approach, we use the variational information bottleneck method and model the latent space as a mixture of Gaussians. We derive a bound on the cost function of our model that generalizes the Evidence Lower Bound (ELBO) and provide a variational inference-type algorithm that allows computing it. In the algorithm, the coders' mappings are parametrized using neural networks, and the bound is approximated by Markov sampling and optimized with stochastic gradient descent. Numerical results on real datasets are provided to support the efficiency of our method.
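
The summary describes the training procedure only at a high level. As a rough illustration, the following is a minimal PyTorch sketch (not the authors' code) of a variational model with a learnable Gaussian-mixture prior on the latent space, where a bound of this general form (a reconstruction term plus a KL term estimated by one-sample Monte Carlo sampling) is optimized with stochastic gradient descent. The layer sizes, the squared-error reconstruction term, the weight beta, and the placeholder batch are illustrative assumptions and do not reproduce the exact bound derived in the paper.

```python
import math
import torch
import torch.nn as nn
import torch.nn.functional as F

class VIBGMM(nn.Module):
    """Encoder/decoder pair with a learnable Gaussian-mixture prior on the latent space."""
    def __init__(self, x_dim=784, z_dim=10, n_clusters=10, hidden=256):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(x_dim, hidden), nn.ReLU(),
                                     nn.Linear(hidden, 2 * z_dim))
        self.decoder = nn.Sequential(nn.Linear(z_dim, hidden), nn.ReLU(),
                                     nn.Linear(hidden, x_dim))
        self.pi_logits = nn.Parameter(torch.zeros(n_clusters))        # mixture weights
        self.mu_c = nn.Parameter(torch.randn(n_clusters, z_dim))      # component means
        self.logvar_c = nn.Parameter(torch.zeros(n_clusters, z_dim))  # component log-variances

    def forward(self, x):
        mu, logvar = self.encoder(x).chunk(2, dim=-1)
        z = mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)  # reparameterization trick
        return self.decoder(z), mu, logvar, z

    def log_prior(self, z):
        # log p(z) under the mixture-of-Gaussians prior, evaluated at the sampled z.
        diff = z.unsqueeze(1) - self.mu_c                              # (B, K, z_dim)
        log_comp = -0.5 * (diff ** 2 / torch.exp(self.logvar_c)
                           + self.logvar_c + math.log(2 * math.pi)).sum(-1)
        return torch.logsumexp(F.log_softmax(self.pi_logits, 0) + log_comp, dim=1)

def bound_estimate(model, x, beta=1.0):
    # One-sample Monte Carlo estimate of: reconstruction + beta * KL(q(z|x) || GMM prior).
    x_hat, mu, logvar, z = model(x)
    recon = F.mse_loss(x_hat, x, reduction='none').sum(-1)
    log_q = -0.5 * (logvar + (z - mu) ** 2 / torch.exp(logvar)
                    + math.log(2 * math.pi)).sum(-1)
    return (recon + beta * (log_q - model.log_prior(z))).mean()

model = VIBGMM()
opt = torch.optim.SGD(model.parameters(), lr=1e-3)
x = torch.rand(32, 784)              # placeholder batch standing in for a real dataset
loss = bound_estimate(model, x)
opt.zero_grad(); loss.backward(); opt.step()
```

In a clustering setup of this kind, a cluster assignment for an input can then be read off from the posterior responsibilities of the mixture components at its encoded latent representation.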