
Attention module improves both performance and interpretability of four‐dimensional functional magnetic resonance imaging decoding neural network

Decoding brain cognitive states from neuroimaging signals is an important topic in neuroscience. In recent years, deep neural networks (DNNs) have been recruited for multiple brain state decoding and achieved good performance. However, the open question of how to interpret the DNN black box remains unanswered. Capitalizing on advances in machine learning, we integrated attention modules into brain decoders to facilitate an in‐depth interpretation of DNN channels. A four‐dimensional (4D) convolution operation was also included to extract the temporo‐spatial interaction within the fMRI signal. The experiments showed that the proposed model achieves a very high accuracy (97.4%) and outperforms previous studies on the seven different task benchmarks from the Human Connectome Project (HCP) dataset. The visualization analysis further illustrated the hierarchical emergence of task‐specific masks with depth. Finally, the model was retrained to regress individual traits within the HCP and to classify viewed images from the BOLD5000 dataset, respectively. Transfer learning also achieved good performance. Further visualization analysis showed that, after transfer learning, low‐level attention masks remained similar to those of the source domain, whereas high‐level attention masks changed adaptively. In conclusion, the proposed 4D model with attention modules performed well and facilitated the interpretation of DNNs, which is helpful for subsequent research.
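The abstract describes an fMRI decoder built from 4D (spatio‐temporal) convolutions with attention modules whose learned channel masks support interpretation. As a rough illustration of the channel‐attention idea only, and not the authors' published architecture (PyTorch has no native 4D convolution, and the class name, layer sizes, and reduction ratio below are assumptions), a squeeze‐and‐excitation‐style attention block over volumetric feature maps might look like this:

```python
# Minimal, illustrative sketch (NOT the paper's released code) of a
# channel-attention block applied to 3D fMRI feature maps.
# Class name, reduction ratio, and tensor sizes are assumptions.
import torch
import torch.nn as nn


class ChannelAttention3D(nn.Module):
    def __init__(self, channels: int, reduction: int = 4):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool3d(1)          # squeeze spatial dims to 1x1x1
        self.fc = nn.Sequential(                     # excitation MLP producing channel weights
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, channels, depth, height, width) volumetric features
        b, c, *_ = x.shape
        weights = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1, 1)
        return x * weights                           # per-channel re-weighting


if __name__ == "__main__":
    feats = torch.randn(2, 32, 8, 8, 8)              # toy volumetric feature maps
    attn = ChannelAttention3D(32)
    print(attn(feats).shape)                         # torch.Size([2, 32, 8, 8, 8])
```

Per‐channel weights of this kind are the sort of channel‐level signal the abstract says the attention modules expose for visualization and interpretation.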


Bibliographic Details
Main Authors: Jiang, Zhoufan, Wang, Yanming, Shi, ChenWei, Wu, Yueyang, Hu, Rongjie, Chen, Shishuo, Hu, Sheng, Wang, Xiaoxiao, Qiu, Bensheng
Format: Online Article Text
Language: English
Published: John Wiley & Sons, Inc. 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9057093/
https://www.ncbi.nlm.nih.gov/pubmed/35212436
http://dx.doi.org/10.1002/hbm.25813
_version_ 1784697817019711488
author Jiang, Zhoufan
Wang, Yanming
Shi, ChenWei
Wu, Yueyang
Hu, Rongjie
Chen, Shishuo
Hu, Sheng
Wang, Xiaoxiao
Qiu, Bensheng
author_facet Jiang, Zhoufan
Wang, Yanming
Shi, ChenWei
Wu, Yueyang
Hu, Rongjie
Chen, Shishuo
Hu, Sheng
Wang, Xiaoxiao
Qiu, Bensheng
author_sort Jiang, Zhoufan
collection PubMed
description Decoding brain cognitive states from neuroimaging signals is an important topic in neuroscience. In recent years, deep neural networks (DNNs) have been recruited for multiple brain state decoding and achieved good performance. However, the open question of how to interpret the DNN black box remains unanswered. Capitalizing on advances in machine learning, we integrated attention modules into brain decoders to facilitate an in‐depth interpretation of DNN channels. A four‐dimensional (4D) convolution operation was also included to extract the temporo‐spatial interaction within the fMRI signal. The experiments showed that the proposed model achieves a very high accuracy (97.4%) and outperforms previous studies on the seven different task benchmarks from the Human Connectome Project (HCP) dataset. The visualization analysis further illustrated the hierarchical emergence of task‐specific masks with depth. Finally, the model was retrained to regress individual traits within the HCP and to classify viewed images from the BOLD5000 dataset, respectively. Transfer learning also achieved good performance. Further visualization analysis showed that, after transfer learning, low‐level attention masks remained similar to those of the source domain, whereas high‐level attention masks changed adaptively. In conclusion, the proposed 4D model with attention modules performed well and facilitated the interpretation of DNNs, which is helpful for subsequent research.
format Online
Article
Text
id pubmed-9057093
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher John Wiley & Sons, Inc.
record_format MEDLINE/PubMed
spelling pubmed-9057093 2022-05-03 Attention module improves both performance and interpretability of four‐dimensional functional magnetic resonance imaging decoding neural network Jiang, Zhoufan Wang, Yanming Shi, ChenWei Wu, Yueyang Hu, Rongjie Chen, Shishuo Hu, Sheng Wang, Xiaoxiao Qiu, Bensheng Hum Brain Mapp Research Articles Decoding brain cognitive states from neuroimaging signals is an important topic in neuroscience. In recent years, deep neural networks (DNNs) have been recruited for multiple brain state decoding and achieved good performance. However, the open question of how to interpret the DNN black box remains unanswered. Capitalizing on advances in machine learning, we integrated attention modules into brain decoders to facilitate an in‐depth interpretation of DNN channels. A four‐dimensional (4D) convolution operation was also included to extract the temporo‐spatial interaction within the fMRI signal. The experiments showed that the proposed model achieves a very high accuracy (97.4%) and outperforms previous studies on the seven different task benchmarks from the Human Connectome Project (HCP) dataset. The visualization analysis further illustrated the hierarchical emergence of task‐specific masks with depth. Finally, the model was retrained to regress individual traits within the HCP and to classify viewed images from the BOLD5000 dataset, respectively. Transfer learning also achieved good performance. Further visualization analysis showed that, after transfer learning, low‐level attention masks remained similar to those of the source domain, whereas high‐level attention masks changed adaptively. In conclusion, the proposed 4D model with attention modules performed well and facilitated the interpretation of DNNs, which is helpful for subsequent research. John Wiley & Sons, Inc. 2022-02-25 /pmc/articles/PMC9057093/ /pubmed/35212436 http://dx.doi.org/10.1002/hbm.25813 Text en © 2022 The Authors. Human Brain Mapping published by Wiley Periodicals LLC. This is an open access article under the terms of the Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/), which permits use, distribution and reproduction in any medium, provided the original work is properly cited.
spellingShingle Research Articles
Jiang, Zhoufan
Wang, Yanming
Shi, ChenWei
Wu, Yueyang
Hu, Rongjie
Chen, Shishuo
Hu, Sheng
Wang, Xiaoxiao
Qiu, Bensheng
Attention module improves both performance and interpretability of four‐dimensional functional magnetic resonance imaging decoding neural network
title Attention module improves both performance and interpretability of four‐dimensional functional magnetic resonance imaging decoding neural network
title_full Attention module improves both performance and interpretability of four‐dimensional functional magnetic resonance imaging decoding neural network
title_fullStr Attention module improves both performance and interpretability of four‐dimensional functional magnetic resonance imaging decoding neural network
title_full_unstemmed Attention module improves both performance and interpretability of four‐dimensional functional magnetic resonance imaging decoding neural network
title_short Attention module improves both performance and interpretability of four‐dimensional functional magnetic resonance imaging decoding neural network
title_sort attention module improves both performance and interpretability of four‐dimensional functional magnetic resonance imaging decoding neural network
topic Research Articles
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9057093/
https://www.ncbi.nlm.nih.gov/pubmed/35212436
http://dx.doi.org/10.1002/hbm.25813
work_keys_str_mv AT jiangzhoufan attentionmoduleimprovesbothperformanceandinterpretabilityoffourdimensionalfunctionalmagneticresonanceimagingdecodingneuralnetwork
AT wangyanming attentionmoduleimprovesbothperformanceandinterpretabilityoffourdimensionalfunctionalmagneticresonanceimagingdecodingneuralnetwork
AT shichenwei attentionmoduleimprovesbothperformanceandinterpretabilityoffourdimensionalfunctionalmagneticresonanceimagingdecodingneuralnetwork
AT wuyueyang attentionmoduleimprovesbothperformanceandinterpretabilityoffourdimensionalfunctionalmagneticresonanceimagingdecodingneuralnetwork
AT hurongjie attentionmoduleimprovesbothperformanceandinterpretabilityoffourdimensionalfunctionalmagneticresonanceimagingdecodingneuralnetwork
AT chenshishuo attentionmoduleimprovesbothperformanceandinterpretabilityoffourdimensionalfunctionalmagneticresonanceimagingdecodingneuralnetwork
AT husheng attentionmoduleimprovesbothperformanceandinterpretabilityoffourdimensionalfunctionalmagneticresonanceimagingdecodingneuralnetwork
AT wangxiaoxiao attentionmoduleimprovesbothperformanceandinterpretabilityoffourdimensionalfunctionalmagneticresonanceimagingdecodingneuralnetwork
AT qiubensheng attentionmoduleimprovesbothperformanceandinterpretabilityoffourdimensionalfunctionalmagneticresonanceimagingdecodingneuralnetwork