Spatially-enhanced clusterwise inference for testing and localizing intermodal correspondence
Main Authors: | Weinstein, Sarah M., Vandekar, Simon N., Baller, Erica B., Tu, Danni, Adebimpe, Azeez, Tapera, Tinashe M., Gur, Ruben C., Gur, Raquel E., Detre, John A., Raznahan, Armin, Alexander-Bloch, Aaron F., Satterthwaite, Theodore D., Shinohara, Russell T., Park, Jun Young |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | 2022 |
Subjects: | |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10062374/ https://www.ncbi.nlm.nih.gov/pubmed/36309332 http://dx.doi.org/10.1016/j.neuroimage.2022.119712 |
author | Weinstein, Sarah M. Vandekar, Simon N. Baller, Erica B. Tu, Danni Adebimpe, Azeez Tapera, Tinashe M. Gur, Ruben C. Gur, Raquel E. Detre, John A. Raznahan, Armin Alexander-Bloch, Aaron F. Satterthwaite, Theodore D. Shinohara, Russell T. Park, Jun Young
---|---|
collection | PubMed |
description | With the increasing availability of neuroimaging data from multiple modalities—each providing a different lens through which to study brain structure or function—new techniques for comparing, integrating, and interpreting information within and across modalities have emerged. Recent developments include hypothesis tests of associations between neuroimaging modalities, which can be used to determine the statistical significance of intermodal associations either throughout the entire brain or within anatomical subregions or functional networks. While these methods provide a crucial foundation for inference on intermodal relationships, they cannot be used to answer questions about where in the brain these associations are most pronounced. In this paper, we introduce a new method, called CLEAN-R, that can be used both to test intermodal correspondence throughout the brain and also to localize this correspondence. Our method involves first adjusting for the underlying spatial autocorrelation structure within each modality before aggregating information within small clusters to construct a map of enhanced test statistics. Using structural and functional magnetic resonance imaging data from a subsample of children and adolescents from the Philadelphia Neurodevelopmental Cohort, we conduct simulations and data analyses where we illustrate the high statistical power and nominal type I error levels of our method. By constructing an interpretable map of group-level correspondence using spatially-enhanced test statistics, our method offers insights beyond those provided by earlier methods. |
format | Online Article Text |
id | pubmed-10062374 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
record_format | MEDLINE/PubMed |
spelling | pubmed-10062374 2023-03-30. Neuroimage, Article. 2022-12-01 (online 2022-10-26). This is an open access article under the CC BY-NC-ND license (https://creativecommons.org/licenses/by-nc-nd/4.0/).
title | Spatially-enhanced clusterwise inference for testing and localizing intermodal correspondence |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10062374/ https://www.ncbi.nlm.nih.gov/pubmed/36309332 http://dx.doi.org/10.1016/j.neuroimage.2022.119712 |
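The description field above outlines the CLEAN-R workflow at a high level: adjust for the spatial autocorrelation within each modality, then aggregate evidence of cross-modal association within small clusters to form enhanced test statistics that can both test and localize correspondence. The sketch below is a minimal, hypothetical Python illustration of that general idea, not the authors' CLEAN-R implementation (see the linked article for the actual method). It assumes toy data on a 1-D grid of vertices, substitutes a crude covariance-based whitening step for the paper's autocorrelation adjustment, defines clusters as fixed-radius neighborhoods, and uses subject-level permutation to obtain a global p-value plus a family-wise-error threshold for localization. All names and parameters (`whiten`, `vertex_stats`, `cluster_enhance`, `radius`, `n_perm`) are illustrative assumptions.

```python
# Hypothetical sketch (NOT the authors' CLEAN-R implementation): a generic
# permutation-based, cluster-aggregated test of intermodal correspondence.
# Assumed setup: two modalities measured at the same vertices for the same
# subjects; spatial autocorrelation is crudely approximated by whitening each
# modality with the inverse square root of its empirical spatial covariance;
# "clusters" are fixed-radius neighborhoods around each vertex.
import numpy as np
from scipy.spatial.distance import cdist

rng = np.random.default_rng(0)

def whiten(Y, eps=1e-6):
    """Crudely decorrelate vertices within one modality (stand-in for the
    autocorrelation adjustment described in the abstract)."""
    C = np.cov(Y, rowvar=False) + eps * np.eye(Y.shape[1])  # vertex x vertex covariance + small ridge
    vals, vecs = np.linalg.eigh(C)
    C_inv_sqrt = vecs @ np.diag(1.0 / np.sqrt(vals)) @ vecs.T
    return (Y - Y.mean(axis=0)) @ C_inv_sqrt

def vertex_stats(Y1, Y2):
    """Per-vertex cross-modal Pearson correlation across subjects, Fisher z-transformed."""
    z1 = (Y1 - Y1.mean(0)) / Y1.std(0)
    z2 = (Y2 - Y2.mean(0)) / Y2.std(0)
    r = (z1 * z2).mean(axis=0)
    return np.arctanh(np.clip(r, -0.999, 0.999))

def cluster_enhance(stats, coords, radius=2.0):
    """Sum vertex statistics within a fixed-radius neighborhood of each vertex."""
    D = cdist(coords, coords)
    return np.array([stats[D[v] <= radius].sum() for v in range(len(stats))])

# Toy data: n subjects, V vertices on a 1-D "cortex", with a shared signal in one patch.
n, V = 40, 120
coords = np.arange(V, dtype=float)[:, None]
shared = rng.standard_normal((n, V))
Y1 = rng.standard_normal((n, V))
Y2 = rng.standard_normal((n, V))
patch = slice(50, 70)                      # region of true intermodal correspondence
Y1[:, patch] += shared[:, patch]
Y2[:, patch] += shared[:, patch]

W1, W2 = whiten(Y1), whiten(Y2)
observed = cluster_enhance(vertex_stats(W1, W2), coords)

# Null distribution: permute subject pairings in one modality and track the maximum
# enhanced statistic, giving a family-wise-error threshold over clusters.
n_perm = 500
max_null = np.empty(n_perm)
for b in range(n_perm):
    perm = rng.permutation(n)
    max_null[b] = cluster_enhance(vertex_stats(W1, W2[perm]), coords).max()

threshold = np.quantile(max_null, 0.95)
global_p = (1 + np.sum(max_null >= observed.max())) / (1 + n_perm)
print(f"global p-value for intermodal correspondence: {global_p:.3f}")
print("vertices whose cluster statistic exceeds the FWER threshold:",
      np.where(observed > threshold)[0])
```

As in the abstract's description, the same enhanced map serves two purposes in this sketch: its maximum gives a single whole-brain test of correspondence, while comparing each cluster statistic to the permutation-based threshold localizes where the correspondence is most pronounced.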