A Bayesian Alternative to Mutual Information for the Hierarchical Clustering of Dependent Random Variables

The use of mutual information as a similarity measure in agglomerative hierarchical clustering (AHC) raises an important issue: some correction needs to be applied for the dimensionality of variables. In this work, we formulate the decision of merging dependent multivariate normal variables in an AHC procedure as a Bayesian model comparison.

Bibliographic Details
Main Authors: Marrelec, Guillaume, Messé, Arnaud, Bellec, Pierre
Format: Online Article Text
Language: English
Published: Public Library of Science 2015
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4583305/
https://www.ncbi.nlm.nih.gov/pubmed/26406245
http://dx.doi.org/10.1371/journal.pone.0137278
_version_ 1782391830120235008
author Marrelec, Guillaume
Messé, Arnaud
Bellec, Pierre
author_facet Marrelec, Guillaume
Messé, Arnaud
Bellec, Pierre
author_sort Marrelec, Guillaume
collection PubMed
description The use of mutual information as a similarity measure in agglomerative hierarchical clustering (AHC) raises an important issue: some correction needs to be applied for the dimensionality of variables. In this work, we formulate the decision of merging dependent multivariate normal variables in an AHC procedure as a Bayesian model comparison. We found that the Bayesian formulation naturally shrinks the empirical covariance matrix towards a matrix set a priori (e.g., the identity), provides an automated stopping rule, and corrects for dimensionality using a term that scales up the measure as a function of the dimensionality of the variables. Also, the resulting log Bayes factor is asymptotically proportional to the plug-in estimate of mutual information, with an additive correction for dimensionality in agreement with the Bayesian information criterion. We investigated the behavior of these Bayesian alternatives (in exact and asymptotic forms) to mutual information on simulated and real data. An encouraging result was first derived on simulations: the hierarchical clustering based on the log Bayes factor outperformed off-the-shelf clustering techniques as well as raw and normalized mutual information in terms of classification accuracy. On a toy example, we found that the Bayesian approaches led to results that were similar to those of mutual information clustering techniques, with the advantage of an automated thresholding. On real functional magnetic resonance imaging (fMRI) datasets measuring brain activity, it identified clusters consistent with the established outcome of standard procedures. On this application, normalized mutual information had a highly atypical behavior, in the sense that it systematically favored very large clusters. These initial experiments suggest that the proposed Bayesian alternatives to mutual information are a useful new tool for hierarchical clustering.
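The description outlines a concrete recipe: for jointly multivariate normal blocks, the plug-in mutual information follows from the log-determinants of the sample covariance matrix and its diagonal blocks, and the log Bayes factor behaves asymptotically like this estimate scaled by the sample size minus a BIC-style dimensionality correction, which also yields an automated stopping rule. The Python sketch below is only an illustration of that recipe, not the authors' exact derivation: the (p*q/2)*log(N) penalty is one plausible count of the p*q cross-covariance parameters, and the function names bic_mi_score and cluster are hypothetical.

import numpy as np

def bic_mi_score(data, idx_a, idx_b):
    # Plug-in mutual information between two column blocks of data (N samples x d
    # variables), minus a BIC-style penalty on the p*q cross-covariance parameters.
    n = data.shape[0]
    p, q = len(idx_a), len(idx_b)
    s = np.cov(data[:, list(idx_a) + list(idx_b)], rowvar=False)
    s_aa, s_bb = s[:p, :p], s[p:, p:]
    # I_hat = 0.5 * log(|S_aa| * |S_bb| / |S|) under the multivariate normal assumption
    mi_hat = 0.5 * (np.linalg.slogdet(s_aa)[1]
                    + np.linalg.slogdet(s_bb)[1]
                    - np.linalg.slogdet(s)[1])
    return n * mi_hat - 0.5 * p * q * np.log(n)

def cluster(data):
    # Greedy agglomerative merging with an automated stopping rule:
    # merge the best-scoring pair while the score stays positive.
    clusters = [[j] for j in range(data.shape[1])]
    while len(clusters) > 1:
        best_score, best_pair = -np.inf, None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                score = bic_mi_score(data, clusters[i], clusters[j])
                if score > best_score:
                    best_score, best_pair = score, (i, j)
        if best_score <= 0:  # no pair favors the dependent model: stop merging
            break
        i, j = best_pair
        clusters[i] += clusters[j]
        del clusters[j]
    return clusters

On independent Gaussian noise, e.g. cluster(np.random.randn(200, 5)), such a score should typically stay below zero and leave the variables unmerged, whereas strongly correlated column blocks should be merged before the stopping rule triggers.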
format Online
Article
Text
id pubmed-4583305
institution National Center for Biotechnology Information
language English
publishDate 2015
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-4583305 2015-10-02 PLoS One Research Article Public Library of Science 2015-09-25 /pmc/articles/PMC4583305/ /pubmed/26406245 http://dx.doi.org/10.1371/journal.pone.0137278 Text en © 2015 Marrelec et al http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.
spellingShingle Research Article
Marrelec, Guillaume
Messé, Arnaud
Bellec, Pierre
A Bayesian Alternative to Mutual Information for the Hierarchical Clustering of Dependent Random Variables
title A Bayesian Alternative to Mutual Information for the Hierarchical Clustering of Dependent Random Variables
title_full A Bayesian Alternative to Mutual Information for the Hierarchical Clustering of Dependent Random Variables
title_fullStr A Bayesian Alternative to Mutual Information for the Hierarchical Clustering of Dependent Random Variables
title_full_unstemmed A Bayesian Alternative to Mutual Information for the Hierarchical Clustering of Dependent Random Variables
title_short A Bayesian Alternative to Mutual Information for the Hierarchical Clustering of Dependent Random Variables
title_sort bayesian alternative to mutual information for the hierarchical clustering of dependent random variables
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4583305/
https://www.ncbi.nlm.nih.gov/pubmed/26406245
http://dx.doi.org/10.1371/journal.pone.0137278
work_keys_str_mv AT marrelecguillaume abayesianalternativetomutualinformationforthehierarchicalclusteringofdependentrandomvariables
AT messearnaud abayesianalternativetomutualinformationforthehierarchicalclusteringofdependentrandomvariables
AT bellecpierre abayesianalternativetomutualinformationforthehierarchicalclusteringofdependentrandomvariables
AT marrelecguillaume bayesianalternativetomutualinformationforthehierarchicalclusteringofdependentrandomvariables
AT messearnaud bayesianalternativetomutualinformationforthehierarchicalclusteringofdependentrandomvariables
AT bellecpierre bayesianalternativetomutualinformationforthehierarchicalclusteringofdependentrandomvariables