CTG-Net: Cross-task guided network for breast ultrasound diagnosis

Deep learning techniques have achieved remarkable success in lesion segmentation and classification between benign and malignant tumors in breast ultrasound images. However, existing studies are predominantly focused on devising efficient neural network-based learning structures to tackle specific tasks individually. By contrast, in clinical practice, sonographers perform segmentation and classification as a whole; they investigate the border contours of the tissue while detecting abnormal masses and performing diagnostic analysis. Performing multiple cognitive tasks simultaneously in this manner facilitates exploitation of the commonalities and differences between tasks. Inspired by this unified recognition process, this study proposes a novel learning scheme, called the cross-task guided network (CTG-Net), for efficient ultrasound breast image understanding. CTG-Net integrates the two most significant tasks in computerized breast lesion pattern investigation: lesion segmentation and tumor classification. Further, it enables the learning of efficient feature representations across tasks from ultrasound images and the task-specific discriminative features that can greatly facilitate lesion detection. This is achieved using task-specific attention models to share the prediction results between tasks. Then, following the guidance of task-specific attention soft masks, the joint feature responses are efficiently calibrated through iterative model training. Finally, a simple feature fusion scheme is used to aggregate the attention-guided features for efficient ultrasound pattern analysis. We performed extensive experimental comparisons on multiple ultrasound datasets. Compared to state-of-the-art multi-task learning approaches, the proposed approach can improve the Dice’s coefficient, true-positive rate of segmentation, AUC, and sensitivity of classification by 11%, 17%, 2%, and 6%, respectively. The results demonstrate that the proposed cross-task guided feature learning framework can effectively fuse the complementary information of ultrasound image segmentation and classification tasks to achieve accurate tumor localization. Thus, it can aid sonographers to detect and diagnose breast cancer.

Bibliographic Details
Main Authors: Yang, Kaiwen, Suzuki, Aiga, Ye, Jiaxing, Nosato, Hirokazu, Izumori, Ayumi, Sakanashi, Hidenori
Format: Online Article Text
Language: English
Published: Public Library of Science 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9371312/
https://www.ncbi.nlm.nih.gov/pubmed/35951606
http://dx.doi.org/10.1371/journal.pone.0271106
_version_ 1784767102504140800
author Yang, Kaiwen
Suzuki, Aiga
Ye, Jiaxing
Nosato, Hirokazu
Izumori, Ayumi
Sakanashi, Hidenori
author_facet Yang, Kaiwen
Suzuki, Aiga
Ye, Jiaxing
Nosato, Hirokazu
Izumori, Ayumi
Sakanashi, Hidenori
author_sort Yang, Kaiwen
collection PubMed
description Deep learning techniques have achieved remarkable success in lesion segmentation and classification between benign and malignant tumors in breast ultrasound images. However, existing studies are predominantly focused on devising efficient neural network-based learning structures to tackle specific tasks individually. By contrast, in clinical practice, sonographers perform segmentation and classification as a whole; they investigate the border contours of the tissue while detecting abnormal masses and performing diagnostic analysis. Performing multiple cognitive tasks simultaneously in this manner facilitates exploitation of the commonalities and differences between tasks. Inspired by this unified recognition process, this study proposes a novel learning scheme, called the cross-task guided network (CTG-Net), for efficient ultrasound breast image understanding. CTG-Net integrates the two most significant tasks in computerized breast lesion pattern investigation: lesion segmentation and tumor classification. Further, it enables the learning of efficient feature representations across tasks from ultrasound images and the task-specific discriminative features that can greatly facilitate lesion detection. This is achieved using task-specific attention models to share the prediction results between tasks. Then, following the guidance of task-specific attention soft masks, the joint feature responses are efficiently calibrated through iterative model training. Finally, a simple feature fusion scheme is used to aggregate the attention-guided features for efficient ultrasound pattern analysis. We performed extensive experimental comparisons on multiple ultrasound datasets. Compared to state-of-the-art multi-task learning approaches, the proposed approach can improve the Dice’s coefficient, true-positive rate of segmentation, AUC, and sensitivity of classification by 11%, 17%, 2%, and 6%, respectively. 
The results demonstrate that the proposed cross-task guided feature learning framework can effectively fuse the complementary information of ultrasound image segmentation and classification tasks to achieve accurate tumor localization. Thus, it can aid sonographers to detect and diagnose breast cancer.
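The two mechanisms the abstract names, calibrating joint features with task-specific attention soft masks and scoring segmentation with Dice's coefficient, can be sketched in a few lines. The paper's code is not part of this record, so the function names below and the residual form of the gating are illustrative assumptions, not CTG-Net's exact formulation:

```python
import numpy as np

def dice_coefficient(pred, target, eps=1e-7):
    # Dice's coefficient: 2|A ∩ B| / (|A| + |B|), the segmentation
    # overlap metric reported in the abstract's comparison.
    pred, target = pred.astype(bool), target.astype(bool)
    inter = np.logical_and(pred, target).sum()
    return (2.0 * inter + eps) / (pred.sum() + target.sum() + eps)

def soft_mask_guidance(features, attention_logits):
    # Gate a joint feature map with a task-specific attention soft mask.
    # The sigmoid squashes logits into (0, 1); the residual (1 + mask)
    # term keeps un-attended regions from being zeroed out entirely.
    # This residual form is a common choice, assumed for illustration.
    soft_mask = 1.0 / (1.0 + np.exp(-attention_logits))
    return features * (1.0 + soft_mask)

# Toy check: a perfect prediction scores Dice ~ 1.0, a disjoint one ~ 0.0.
gt = np.array([[1, 1], [0, 0]])
print(round(dice_coefficient(gt, gt), 4))      # 1.0
print(round(dice_coefficient(gt, 1 - gt), 4))  # 0.0
```

In this sketch the soft mask from one task's attention head would be applied to the shared features feeding the other task's head, which is how prediction results can be shared between the segmentation and classification branches.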
format Online
Article
Text
id pubmed-9371312
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Public Library of Science
record_format MEDLINE/PubMed
spelling pubmed-9371312 2022-08-12 CTG-Net: Cross-task guided network for breast ultrasound diagnosis Yang, Kaiwen Suzuki, Aiga Ye, Jiaxing Nosato, Hirokazu Izumori, Ayumi Sakanashi, Hidenori PLoS One Research Article Deep learning techniques have achieved remarkable success in lesion segmentation and classification between benign and malignant tumors in breast ultrasound images. However, existing studies are predominantly focused on devising efficient neural network-based learning structures to tackle specific tasks individually. By contrast, in clinical practice, sonographers perform segmentation and classification as a whole; they investigate the border contours of the tissue while detecting abnormal masses and performing diagnostic analysis. Performing multiple cognitive tasks simultaneously in this manner facilitates exploitation of the commonalities and differences between tasks. Inspired by this unified recognition process, this study proposes a novel learning scheme, called the cross-task guided network (CTG-Net), for efficient ultrasound breast image understanding. CTG-Net integrates the two most significant tasks in computerized breast lesion pattern investigation: lesion segmentation and tumor classification. Further, it enables the learning of efficient feature representations across tasks from ultrasound images and the task-specific discriminative features that can greatly facilitate lesion detection. This is achieved using task-specific attention models to share the prediction results between tasks. Then, following the guidance of task-specific attention soft masks, the joint feature responses are efficiently calibrated through iterative model training. Finally, a simple feature fusion scheme is used to aggregate the attention-guided features for efficient ultrasound pattern analysis. We performed extensive experimental comparisons on multiple ultrasound datasets.
Compared to state-of-the-art multi-task learning approaches, the proposed approach can improve the Dice’s coefficient, true-positive rate of segmentation, AUC, and sensitivity of classification by 11%, 17%, 2%, and 6%, respectively. The results demonstrate that the proposed cross-task guided feature learning framework can effectively fuse the complementary information of ultrasound image segmentation and classification tasks to achieve accurate tumor localization. Thus, it can aid sonographers to detect and diagnose breast cancer. Public Library of Science 2022-08-11 /pmc/articles/PMC9371312/ /pubmed/35951606 http://dx.doi.org/10.1371/journal.pone.0271106 Text en © 2022 Yang et al https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
spellingShingle Research Article
Yang, Kaiwen
Suzuki, Aiga
Ye, Jiaxing
Nosato, Hirokazu
Izumori, Ayumi
Sakanashi, Hidenori
CTG-Net: Cross-task guided network for breast ultrasound diagnosis
title CTG-Net: Cross-task guided network for breast ultrasound diagnosis
title_full CTG-Net: Cross-task guided network for breast ultrasound diagnosis
title_fullStr CTG-Net: Cross-task guided network for breast ultrasound diagnosis
title_full_unstemmed CTG-Net: Cross-task guided network for breast ultrasound diagnosis
title_short CTG-Net: Cross-task guided network for breast ultrasound diagnosis
title_sort ctg-net: cross-task guided network for breast ultrasound diagnosis
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9371312/
https://www.ncbi.nlm.nih.gov/pubmed/35951606
http://dx.doi.org/10.1371/journal.pone.0271106
work_keys_str_mv AT yangkaiwen ctgnetcrosstaskguidednetworkforbreastultrasounddiagnosis
AT suzukiaiga ctgnetcrosstaskguidednetworkforbreastultrasounddiagnosis
AT yejiaxing ctgnetcrosstaskguidednetworkforbreastultrasounddiagnosis
AT nosatohirokazu ctgnetcrosstaskguidednetworkforbreastultrasounddiagnosis
AT izumoriayumi ctgnetcrosstaskguidednetworkforbreastultrasounddiagnosis
AT sakanashihidenori ctgnetcrosstaskguidednetworkforbreastultrasounddiagnosis