
An adaptive multi-modal hybrid model for classifying thyroid nodules by combining ultrasound and infrared thermal images

Bibliographic Details

Main Authors: Zhang, Na, Liu, Juan, Jin, Yu, Duan, Wensi, Wu, Ziling, Cai, Zhaohui, Wu, Meng
Format: Online Article Text
Language: English
Published: BioMed Central 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10440038/
https://www.ncbi.nlm.nih.gov/pubmed/37598159
http://dx.doi.org/10.1186/s12859-023-05446-2
_version_ 1785093088347160576
author Zhang, Na
Liu, Juan
Jin, Yu
Duan, Wensi
Wu, Ziling
Cai, Zhaohui
Wu, Meng
author_facet Zhang, Na
Liu, Juan
Jin, Yu
Duan, Wensi
Wu, Ziling
Cai, Zhaohui
Wu, Meng
author_sort Zhang, Na
collection PubMed
description BACKGROUND: Ultrasound (US) and infrared thermography (IRT) are two non-invasive, radiation-free, and inexpensive imaging technologies that are widely employed in medical applications. A US image primarily conveys morphological information about a lesion, such as its size, shape, contour boundary, and echo pattern, whereas an infrared thermal image primarily describes its thermodynamic (functional) information. Although distinguishing between benign and malignant thyroid nodules requires both morphological and functional information, existing deep learning models are based only on US images, so malignant nodules with insignificant morphological changes but significant functional changes may go undetected. RESULTS: Because US and IRT images present thyroid nodules through distinct modalities, we propose an Adaptive multi-modal Hybrid (AmmH) classification model that leverages the combination of the two image types to achieve superior classification performance. The AmmH approach builds a hybrid single-modal encoder module for each modality, which extracts both local and global features by integrating a CNN module and a Transformer module. The features extracted from the two modalities are then weighted adaptively by an adaptive modality-weight generation network and fused by an adaptive cross-modal encoder module. The fused features are finally passed to an MLP for thyroid nodule classification. On the collected dataset, our AmmH model achieved F1 and F2 scores of 97.17% and 97.38%, respectively, significantly outperforming the single-modal models. Four ablation experiments further demonstrate the superiority of the proposed method. CONCLUSIONS: The proposed multi-modal model extracts features from images of different modalities, making the description of thyroid nodules more comprehensive. The adaptive modality-weight generation network enables the model to attend adaptively to the different modalities, and the adaptive cross-modal encoder fuses the features using these adaptive weights. Consequently, the model demonstrates promising classification performance, indicating its potential as a non-invasive, radiation-free, and cost-effective screening tool for distinguishing between benign and malignant thyroid nodules. The source code is available at https://github.com/wuliZN2020/AmmH.
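For illustration only, the following is a minimal, hypothetical PyTorch-style sketch of the pipeline described in the abstract: a CNN-plus-Transformer hybrid encoder per modality, an adaptive modality-weight generation network, an adaptive cross-modal encoder, and an MLP classification head. All class names, layer choices, dimensions, and the exact weighting and fusion scheme are assumptions made for this sketch and are not taken from the authors' implementation; the actual code is in the linked repository (https://github.com/wuliZN2020/AmmH).

# Hypothetical sketch of an AmmH-like model; not the authors' implementation.
import torch
import torch.nn as nn

class HybridSingleModalEncoder(nn.Module):
    # CNN branch for local features plus a Transformer encoder for global context.
    def __init__(self, embed_dim=256, num_heads=4, num_layers=2):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, embed_dim, kernel_size=3, stride=2, padding=1), nn.ReLU(),
        )
        layer = nn.TransformerEncoderLayer(d_model=embed_dim, nhead=num_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(layer, num_layers=num_layers)

    def forward(self, x):
        feats = self.cnn(x)                          # (B, C, H', W') local features
        tokens = feats.flatten(2).transpose(1, 2)    # (B, H'*W', C) token sequence
        return self.transformer(tokens).mean(dim=1)  # (B, C) pooled modality feature

class AmmHSketch(nn.Module):
    # US/IRT encoders -> adaptive modality weights -> cross-modal fusion -> MLP head.
    def __init__(self, embed_dim=256, num_classes=2):
        super().__init__()
        self.us_encoder = HybridSingleModalEncoder(embed_dim)
        self.irt_encoder = HybridSingleModalEncoder(embed_dim)
        # Assumed form of the adaptive modality-weight generation network:
        # one softmax weight per modality, produced from the concatenated features.
        self.weight_net = nn.Sequential(nn.Linear(2 * embed_dim, 2), nn.Softmax(dim=-1))
        # Assumed cross-modal encoder: multi-head self-attention over the two
        # weighted modality tokens.
        self.cross_attn = nn.MultiheadAttention(embed_dim, num_heads=4, batch_first=True)
        self.mlp_head = nn.Sequential(nn.Linear(embed_dim, 64), nn.ReLU(), nn.Linear(64, num_classes))

    def forward(self, us_img, irt_img):
        f_us = self.us_encoder(us_img)                                    # (B, C)
        f_irt = self.irt_encoder(irt_img)                                 # (B, C)
        w = self.weight_net(torch.cat([f_us, f_irt], dim=-1))             # (B, 2) adaptive weights
        tokens = torch.stack([w[:, :1] * f_us, w[:, 1:] * f_irt], dim=1)  # (B, 2, C)
        fused, _ = self.cross_attn(tokens, tokens, tokens)                # (B, 2, C)
        return self.mlp_head(fused.mean(dim=1))                           # (B, num_classes) logits

if __name__ == "__main__":
    model = AmmHSketch()
    us = torch.randn(2, 1, 64, 64)    # toy grayscale "ultrasound" batch (size assumed)
    irt = torch.randn(2, 1, 64, 64)   # toy "infrared thermal" batch (size assumed)
    print(model(us, irt).shape)       # torch.Size([2, 2])

For reference on the reported metrics, the F2 score weights recall twice as heavily as precision (F_beta = (1 + beta^2) * precision * recall / (beta^2 * precision + recall), with beta = 2), which suits a screening setting where missing a malignant nodule is costlier than a false positive.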
format Online
Article
Text
id pubmed-10440038
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-10440038 2023-08-21 An adaptive multi-modal hybrid model for classifying thyroid nodules by combining ultrasound and infrared thermal images. Zhang, Na; Liu, Juan; Jin, Yu; Duan, Wensi; Wu, Ziling; Cai, Zhaohui; Wu, Meng. BMC Bioinformatics (Research). BioMed Central 2023-08-19 /pmc/articles/PMC10440038/ /pubmed/37598159 http://dx.doi.org/10.1186/s12859-023-05446-2 Text en © The Author(s) 2023. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution, and reproduction in any medium or format, provided appropriate credit is given and any changes are indicated.
spellingShingle Research
Zhang, Na
Liu, Juan
Jin, Yu
Duan, Wensi
Wu, Ziling
Cai, Zhaohui
Wu, Meng
An adaptive multi-modal hybrid model for classifying thyroid nodules by combining ultrasound and infrared thermal images
title An adaptive multi-modal hybrid model for classifying thyroid nodules by combining ultrasound and infrared thermal images
title_full An adaptive multi-modal hybrid model for classifying thyroid nodules by combining ultrasound and infrared thermal images
title_fullStr An adaptive multi-modal hybrid model for classifying thyroid nodules by combining ultrasound and infrared thermal images
title_full_unstemmed An adaptive multi-modal hybrid model for classifying thyroid nodules by combining ultrasound and infrared thermal images
title_short An adaptive multi-modal hybrid model for classifying thyroid nodules by combining ultrasound and infrared thermal images
title_sort adaptive multi-modal hybrid model for classifying thyroid nodules by combining ultrasound and infrared thermal images
topic Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10440038/
https://www.ncbi.nlm.nih.gov/pubmed/37598159
http://dx.doi.org/10.1186/s12859-023-05446-2
work_keys_str_mv AT zhangna anadaptivemultimodalhybridmodelforclassifyingthyroidnodulesbycombiningultrasoundandinfraredthermalimages
AT liujuan anadaptivemultimodalhybridmodelforclassifyingthyroidnodulesbycombiningultrasoundandinfraredthermalimages
AT jinyu anadaptivemultimodalhybridmodelforclassifyingthyroidnodulesbycombiningultrasoundandinfraredthermalimages
AT duanwensi anadaptivemultimodalhybridmodelforclassifyingthyroidnodulesbycombiningultrasoundandinfraredthermalimages
AT wuziling anadaptivemultimodalhybridmodelforclassifyingthyroidnodulesbycombiningultrasoundandinfraredthermalimages
AT caizhaohui anadaptivemultimodalhybridmodelforclassifyingthyroidnodulesbycombiningultrasoundandinfraredthermalimages
AT wumeng anadaptivemultimodalhybridmodelforclassifyingthyroidnodulesbycombiningultrasoundandinfraredthermalimages
AT zhangna adaptivemultimodalhybridmodelforclassifyingthyroidnodulesbycombiningultrasoundandinfraredthermalimages
AT liujuan adaptivemultimodalhybridmodelforclassifyingthyroidnodulesbycombiningultrasoundandinfraredthermalimages
AT jinyu adaptivemultimodalhybridmodelforclassifyingthyroidnodulesbycombiningultrasoundandinfraredthermalimages
AT duanwensi adaptivemultimodalhybridmodelforclassifyingthyroidnodulesbycombiningultrasoundandinfraredthermalimages
AT wuziling adaptivemultimodalhybridmodelforclassifyingthyroidnodulesbycombiningultrasoundandinfraredthermalimages
AT caizhaohui adaptivemultimodalhybridmodelforclassifyingthyroidnodulesbycombiningultrasoundandinfraredthermalimages
AT wumeng adaptivemultimodalhybridmodelforclassifyingthyroidnodulesbycombiningultrasoundandinfraredthermalimages