
Multi-head attention-based masked sequence model for mapping functional brain networks

The investigation of functional brain networks (FBNs) using task-based functional magnetic resonance imaging (tfMRI) has gained significant attention in the field of neuroimaging. Despite the availability of several methods for constructing FBNs, including traditional methods like GLM and deep learning methods such as spatiotemporal self-attention mechanism (STAAE), these methods have design and training limitations. Specifically, they do not consider the intrinsic characteristics of fMRI data, such as the possibility that the same signal value at different time points could represent different brain states and meanings. Furthermore, they overlook prior knowledge, such as task designs, during training. This study aims to overcome these limitations and develop a more efficient model by drawing inspiration from techniques in the field of natural language processing (NLP). The proposed model, called the Multi-head Attention-based Masked Sequence Model (MAMSM), uses a multi-headed attention mechanism and mask training approach to learn different states corresponding to the same voxel values. Additionally, it combines cosine similarity and task design curves to construct a novel loss function. The MAMSM was applied to seven task state datasets from the Human Connectome Project (HCP) tfMRI dataset. Experimental results showed that the features acquired by the MAMSM model exhibit a Pearson correlation coefficient with the task design curves above 0.95 on average. Moreover, the model can extract more meaningful networks beyond the known task-related brain networks. The experimental results demonstrated that MAMSM has great potential in advancing the understanding of functional brain networks.
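To make the two quantitative ideas in the abstract concrete, here is a minimal sketch, assuming PyTorch: a masked-sequence reconstruction objective augmented with a cosine-similarity term against the task design curves, plus the Pearson-correlation evaluation the abstract reports (above 0.95 on average). The function names, tensor shapes, and the trade-off weight `alpha` are illustrative assumptions; this record does not give MAMSM's exact formulation.

```python
import torch
import torch.nn.functional as F


def mamsm_style_loss(reconstructed, target, mask, features, task_designs, alpha=0.5):
    """reconstructed, target: (batch, time) fMRI signals.
    mask: (batch, time) bool, True at time points hidden during training.
    features: (n_features, time) temporal features learned by the model.
    task_designs: (n_tasks, time) task design (stimulus) curves."""
    # Masked-sequence term: reconstruct only the masked time points,
    # analogous to masked token prediction in NLP.
    recon_term = F.mse_loss(reconstructed[mask], target[mask])

    # Task-guidance term: cosine similarity between every feature and every
    # task design curve; reward the best match per task (0 = perfect match).
    sims = F.cosine_similarity(
        features.unsqueeze(0),      # (1, n_features, time)
        task_designs.unsqueeze(1),  # (n_tasks, 1, time)
        dim=-1,
    )                               # -> (n_tasks, n_features)
    task_term = (1.0 - sims.max(dim=1).values).mean()

    return recon_term + alpha * task_term


def pearson_with_designs(features, task_designs):
    """Pearson correlation of each learned feature with each task design
    curve, the evaluation metric the abstract reports."""
    f = features - features.mean(dim=-1, keepdim=True)
    d = task_designs - task_designs.mean(dim=-1, keepdim=True)
    num = d @ f.T                                        # (n_tasks, n_features)
    den = d.norm(dim=-1, keepdim=True) * f.norm(dim=-1)  # broadcasts to match
    return num / den
```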


Bibliographic Details
Main Authors: He, Mengshen, Hou, Xiangyu, Ge, Enjie, Wang, Zhenwei, Kang, Zili, Qiang, Ning, Zhang, Xin, Ge, Bao
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2023
Subjects: Neuroscience
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10192686/
https://www.ncbi.nlm.nih.gov/pubmed/37214388
http://dx.doi.org/10.3389/fnins.2023.1183145
author He, Mengshen
Hou, Xiangyu
Ge, Enjie
Wang, Zhenwei
Kang, Zili
Qiang, Ning
Zhang, Xin
Ge, Bao
collection PubMed
description The investigation of functional brain networks (FBNs) using task-based functional magnetic resonance imaging (tfMRI) has gained significant attention in the field of neuroimaging. Despite the availability of several methods for constructing FBNs, including traditional methods like GLM and deep learning methods such as spatiotemporal self-attention mechanism (STAAE), these methods have design and training limitations. Specifically, they do not consider the intrinsic characteristics of fMRI data, such as the possibility that the same signal value at different time points could represent different brain states and meanings. Furthermore, they overlook prior knowledge, such as task designs, during training. This study aims to overcome these limitations and develop a more efficient model by drawing inspiration from techniques in the field of natural language processing (NLP). The proposed model, called the Multi-head Attention-based Masked Sequence Model (MAMSM), uses a multi-headed attention mechanism and mask training approach to learn different states corresponding to the same voxel values. Additionally, it combines cosine similarity and task design curves to construct a novel loss function. The MAMSM was applied to seven task state datasets from the Human Connectome Project (HCP) tfMRI dataset. Experimental results showed that the features acquired by the MAMSM model exhibit a Pearson correlation coefficient with the task design curves above 0.95 on average. Moreover, the model can extract more meaningful networks beyond the known task-related brain networks. The experimental results demonstrated that MAMSM has great potential in advancing the understanding of functional brain networks.
format Online
Article
Text
id pubmed-10192686
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Frontiers Media S.A.
record_format MEDLINE/PubMed
spelling pubmed-10192686 2023-05-19 Front Neurosci Neuroscience Frontiers Media S.A. 2023-05-04 /pmc/articles/PMC10192686/ /pubmed/37214388 http://dx.doi.org/10.3389/fnins.2023.1183145 Text en Copyright © 2023 He, Hou, Ge, Wang, Kang, Qiang, Zhang and Ge. https://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.
title Multi-head attention-based masked sequence model for mapping functional brain networks
topic Neuroscience
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10192686/
https://www.ncbi.nlm.nih.gov/pubmed/37214388
http://dx.doi.org/10.3389/fnins.2023.1183145