Auto-encoded Latent Representations of White Matter Streamlines for Quantitative Distance Analysis

Parcellation of whole brain tractograms is a critical step in studying brain white matter structures and connectivity patterns. Existing methods based on supervised classification of streamlines into predefined bundle types are not designed to explore sub-bundle structures, and methods built on manually designed features are computationally expensive for streamline-wise similarity calculation. To resolve these issues, we propose a novel atlas-free method that learns a latent space using a deep recurrent auto-encoder trained in an unsupervised manner. The method efficiently embeds streamlines of any length into fixed-size feature vectors, named streamline embeddings, for tractogram parcellation using non-parametric clustering in the latent space. The method was evaluated on the ISMRM 2015 tractography challenge dataset, with discrimination of major bundles using clustering algorithms and similarity-based streamline querying, as well as on real tractograms of 102 subjects from the Human Connectome Project. The learnt latent streamline and bundle representations open the possibility of quantitative studies of sub-bundle structures at arbitrary granularity using generic data mining techniques.

Bibliographic Details
Main Authors: Zhong, Shenjun, Chen, Zhaolin, Egan, Gary
Format: Online Article Text
Language: English
Published: Springer US 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9588484/
https://www.ncbi.nlm.nih.gov/pubmed/35731372
http://dx.doi.org/10.1007/s12021-022-09593-4
_version_ 1784814139184513024
author Zhong, Shenjun
Chen, Zhaolin
Egan, Gary
author_facet Zhong, Shenjun
Chen, Zhaolin
Egan, Gary
author_sort Zhong, Shenjun
collection PubMed
description Parcellation of whole brain tractograms is a critical step in studying brain white matter structures and connectivity patterns. Existing methods based on supervised classification of streamlines into predefined bundle types are not designed to explore sub-bundle structures, and methods built on manually designed features are computationally expensive for streamline-wise similarity calculation. To resolve these issues, we propose a novel atlas-free method that learns a latent space using a deep recurrent auto-encoder trained in an unsupervised manner. The method efficiently embeds streamlines of any length into fixed-size feature vectors, named streamline embeddings, for tractogram parcellation using non-parametric clustering in the latent space. The method was evaluated on the ISMRM 2015 tractography challenge dataset, with discrimination of major bundles using clustering algorithms and similarity-based streamline querying, as well as on real tractograms of 102 subjects from the Human Connectome Project. The learnt latent streamline and bundle representations open the possibility of quantitative studies of sub-bundle structures at arbitrary granularity using generic data mining techniques.
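The record gives only a high-level description of the method, but the central mechanism, a recurrent auto-encoder whose final hidden state serves as a fixed-size embedding for a streamline of any length, can be sketched in a few lines. The following is a minimal sketch, not the authors' implementation: the GRU cells, the 64-dimensional latent size, the teacher-forced decoder, and the MSE reconstruction loss are all assumptions made for illustration.

import torch
import torch.nn as nn

# Minimal sketch of a recurrent streamline auto-encoder. Architecture
# choices here (GRU, latent_dim=64, MSE loss) are illustrative
# assumptions, not the configuration reported in the paper.
class StreamlineAutoEncoder(nn.Module):
    def __init__(self, point_dim=3, latent_dim=64):
        super().__init__()
        self.encoder = nn.GRU(point_dim, latent_dim, batch_first=True)
        self.decoder = nn.GRU(point_dim, latent_dim, batch_first=True)
        self.out = nn.Linear(latent_dim, point_dim)

    def embed(self, streamlines):
        # streamlines: (batch, n_points, 3). The encoder's final hidden
        # state is a fixed-size vector regardless of n_points, which is
        # what makes variable-length streamlines comparable downstream.
        _, h = self.encoder(streamlines)
        return h.squeeze(0)  # (batch, latent_dim)

    def forward(self, streamlines):
        z = self.embed(streamlines)
        # Teacher-forced decoding conditioned on the latent code.
        decoded, _ = self.decoder(streamlines, z.unsqueeze(0))
        return self.out(decoded)

model = StreamlineAutoEncoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(8, 50, 3)  # 8 dummy streamlines of 50 points each
loss = nn.functional.mse_loss(model(x), x)  # unsupervised reconstruction
loss.backward()
optimizer.step()

Streamlines of different lengths within a batch would be handled with torch.nn.utils.rnn.pack_padded_sequence; the fixed-size embedding itself comes from the final hidden state, so no resampling to a common point count is required.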
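Downstream, the description mentions non-parametric clustering in the latent space and similarity-based streamline querying. A sketch of how those two operations might look on the embeddings, again under stated assumptions: the record does not name the clustering algorithm or the distance metric, so DBSCAN and Euclidean nearest-neighbour search below are stand-ins.

import numpy as np
from sklearn.cluster import DBSCAN
from sklearn.neighbors import NearestNeighbors

# Stand-in for embeddings produced by the auto-encoder sketch above.
embeddings = np.random.randn(1000, 64)

# Parcellation via non-parametric clustering: DBSCAN requires no preset
# number of clusters, matching the abstract's "non-parametric" claim.
# eps and min_samples are illustrative values.
labels = DBSCAN(eps=0.5, min_samples=10).fit_predict(embeddings)

# Similarity-based querying: the k nearest embeddings to a query
# streamline, by Euclidean distance in the latent space.
index = NearestNeighbors(n_neighbors=5).fit(embeddings)
distances, neighbours = index.kneighbors(embeddings[:1])

Because the embeddings are fixed-size vectors, any off-the-shelf clustering or indexing tool applies directly, which is the substance of the abstract's claim that the representations enable generic data mining techniques.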
format Online
Article
Text
id pubmed-9588484
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Springer US
record_format MEDLINE/PubMed
spelling pubmed-9588484 2022-10-25 Auto-encoded Latent Representations of White Matter Streamlines for Quantitative Distance Analysis Zhong, Shenjun Chen, Zhaolin Egan, Gary Neuroinformatics Original Article Parcellation of whole brain tractograms is a critical step in studying brain white matter structures and connectivity patterns. Existing methods based on supervised classification of streamlines into predefined bundle types are not designed to explore sub-bundle structures, and methods built on manually designed features are computationally expensive for streamline-wise similarity calculation. To resolve these issues, we propose a novel atlas-free method that learns a latent space using a deep recurrent auto-encoder trained in an unsupervised manner. The method efficiently embeds streamlines of any length into fixed-size feature vectors, named streamline embeddings, for tractogram parcellation using non-parametric clustering in the latent space. The method was evaluated on the ISMRM 2015 tractography challenge dataset, with discrimination of major bundles using clustering algorithms and similarity-based streamline querying, as well as on real tractograms of 102 subjects from the Human Connectome Project. The learnt latent streamline and bundle representations open the possibility of quantitative studies of sub-bundle structures at arbitrary granularity using generic data mining techniques. Springer US 2022-06-22 2022 /pmc/articles/PMC9588484/ /pubmed/35731372 http://dx.doi.org/10.1007/s12021-022-09593-4 Text en © The Author(s) 2022 https://creativecommons.org/licenses/by/4.0/ Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Original Article
Zhong, Shenjun
Chen, Zhaolin
Egan, Gary
Auto-encoded Latent Representations of White Matter Streamlines for Quantitative Distance Analysis
title Auto-encoded Latent Representations of White Matter Streamlines for Quantitative Distance Analysis
title_full Auto-encoded Latent Representations of White Matter Streamlines for Quantitative Distance Analysis
title_fullStr Auto-encoded Latent Representations of White Matter Streamlines for Quantitative Distance Analysis
title_full_unstemmed Auto-encoded Latent Representations of White Matter Streamlines for Quantitative Distance Analysis
title_short Auto-encoded Latent Representations of White Matter Streamlines for Quantitative Distance Analysis
title_sort auto-encoded latent representations of white matter streamlines for quantitative distance analysis
topic Original Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9588484/
https://www.ncbi.nlm.nih.gov/pubmed/35731372
http://dx.doi.org/10.1007/s12021-022-09593-4
work_keys_str_mv AT zhongshenjun autoencodedlatentrepresentationsofwhitematterstreamlinesforquantitativedistanceanalysis
AT chenzhaolin autoencodedlatentrepresentationsofwhitematterstreamlinesforquantitativedistanceanalysis
AT egangary autoencodedlatentrepresentationsofwhitematterstreamlinesforquantitativedistanceanalysis