From here to infinity: sparse finite versus Dirichlet process mixtures in model-based clustering

In model-based clustering, mixture models are used to group data points into clusters. A useful concept introduced for Gaussian mixtures by Malsiner Walli et al. (Stat Comput 26:303–324, 2016) is that of sparse finite mixtures, where the prior distribution on the weight distribution of a mixture with K components is chosen such that, a priori, the number of clusters in the data is random and is allowed to be smaller than K with high probability. The number of clusters is then inferred a posteriori from the data. The present paper makes the following contributions in the context of sparse finite mixture modelling. First, it is illustrated that the concept of sparse finite mixtures is very generic and easily extended to cluster various types of non-Gaussian data, in particular discrete data and continuous multivariate data arising from non-Gaussian clusters. Second, sparse finite mixtures are compared to Dirichlet process mixtures with respect to their ability to identify the number of clusters. For both model classes, a random hyperprior is considered for the parameters determining the weight distribution. By suitably matching these priors, it is shown that the choice of this hyperprior is far more influential on the cluster solution than whether a sparse finite mixture or a Dirichlet process mixture is used.
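The sparsity mechanism described in the abstract can be illustrated with a short simulation (a minimal sketch, not the authors' code; the values of K, e0, and the random seed are arbitrary choices for illustration): drawing the mixture weights from a symmetric Dirichlet prior with a small concentration parameter makes most of the K components empty a priori, so the number of occupied clusters typically falls below K.

```python
import numpy as np

rng = np.random.default_rng(0)

K = 10        # number of components in the overfitted mixture
n = 1000      # number of simulated data points
e0 = 0.01     # small Dirichlet concentration -> sparse weights a priori

# Draw mixture weights from a symmetric Dirichlet prior Dir(e0, ..., e0).
weights = rng.dirichlet(np.full(K, e0))

# Assign each observation to a component according to the prior weights.
assignments = rng.choice(K, size=n, p=weights)

# "Clusters" are the components that actually receive observations;
# with a small e0, this count is typically far below K.
n_clusters = len(np.unique(assignments))
print(n_clusters, "of", K, "components are occupied")
```

Note that in the paper the concentration parameter itself is given a random hyperprior rather than being fixed; fixing e0 here just keeps the sketch minimal.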


Bibliographic Details
Main Authors: Frühwirth-Schnatter, Sylvia, Malsiner-Walli, Gertraud
Format: Online Article Text
Language: English
Published: Springer Berlin Heidelberg 2018
Subjects: Regular Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6448299/
https://www.ncbi.nlm.nih.gov/pubmed/31007770
http://dx.doi.org/10.1007/s11634-018-0329-y
id pubmed-6448299
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
spelling pubmed-6448299 2019-04-17 Adv Data Anal Classif (Regular Article)
Springer Berlin Heidelberg 2018-08-24 2019 /pmc/articles/PMC6448299/ /pubmed/31007770 http://dx.doi.org/10.1007/s11634-018-0329-y Text en © The Author(s) 2018. Open Access: This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.