
Two-stage CNNs for computerized BI-RADS categorization in breast ultrasound images

Bibliographic Details
Main Authors: Huang, Yunzhi, Han, Luyi, Dou, Haoran, Luo, Honghao, Yuan, Zhen, Liu, Qi, Zhang, Jiang, Yin, Guangfu
Format: Online Article Text
Language: English
Published: BioMed Central 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6346503/
https://www.ncbi.nlm.nih.gov/pubmed/30678680
http://dx.doi.org/10.1186/s12938-019-0626-5
author Huang, Yunzhi
Han, Luyi
Dou, Haoran
Luo, Honghao
Yuan, Zhen
Liu, Qi
Zhang, Jiang
Yin, Guangfu
collection PubMed
description BACKGROUND: Quantizing the Breast Imaging Reporting and Data System (BI-RADS) criteria into distinct categories from the single ultrasound modality has always been a challenge. To address this, we proposed a two-stage grading system based on convolutional neural networks (CNNs) that automatically evaluates breast tumors in ultrasound images into five categories. METHODS: This newly developed automatic grading system consists of two stages: tumor identification and tumor grading. The identification network, denoted ROI-CNN, locates the region containing the tumor in the original breast ultrasound image. The subsequent grading network, denoted G-CNN, generates effective features for differentiating the identified regions of interest (ROIs) into five categories: Category “3”, Category “4A”, Category “4B”, Category “4C”, and Category “5”. In particular, so that the regions predicted by the ROI-CNN fit the tumor more closely, a level-set-based refinement procedure serves as a bridge between the identification stage and the grading stage. RESULTS: We tested the proposed two-stage grading system on 2238 cases of breast tumors in ultrasound images. Using accuracy as the indicator, our automatic computerized grading of breast tumors performed comparably to the subjective categorization by physicians. Experimental results show that our two-stage framework achieves an accuracy of 0.998 on Category “3”, 0.940 on Category “4A”, 0.734 on Category “4B”, 0.922 on Category “4C”, and 0.876 on Category “5”. CONCLUSION: By decoupling identification features and classification features into different CNNs, the proposed scheme extracts effective features from breast ultrasound images for the final classification of breast tumors.
Moreover, it extends the diagnosis of breast tumors in ultrasound images to five sub-categories according to BI-RADS, rather than merely distinguishing malignant tumors from benign ones.
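The two-stage pipeline in the abstract (ROI-CNN identification, level-set refinement as a bridge, G-CNN grading) can be sketched structurally as three decoupled steps. The sketch below is illustrative only: `roi_cnn`, `level_set_refine`, and `g_cnn` are stand-in functions invented here to show the data flow, not the authors' actual networks or refinement algorithm.

```python
import numpy as np

# The five BI-RADS sub-categories targeted by the grading stage.
CATEGORIES = ["3", "4A", "4B", "4C", "5"]

def roi_cnn(image):
    # Stand-in for the ROI-CNN: in the paper this is a trained detection
    # network; here we just return a coarse central bounding box.
    h, w = image.shape
    return (h // 4, w // 4, 3 * h // 4, 3 * w // 4)

def level_set_refine(image, box, iterations=5):
    # Stand-in for the level-set refinement that tightens the predicted
    # region around the tumor; here we crudely shrink the box a few steps.
    y0, x0, y1, x1 = box
    for _ in range(iterations):
        if image[y0:y1, x0:x1].mean() > image.mean():
            break
        y0, x0, y1, x1 = y0 + 1, x0 + 1, y1 - 1, x1 - 1
    return (y0, x0, y1, x1)

def g_cnn(roi):
    # Stand-in for the G-CNN classifier: maps the ROI to one of the five
    # categories (here via mean intensity, purely for illustration).
    idx = min(int(roi.mean() * len(CATEGORIES)), len(CATEGORIES) - 1)
    return CATEGORIES[idx]

def grade(image):
    # The two-stage flow: identify the ROI, refine it, then grade it.
    box = roi_cnn(image)
    y0, x0, y1, x1 = level_set_refine(image, box)
    return g_cnn(image[y0:y1, x0:x1])
```

The point of the structure is the decoupling the conclusion describes: the identification stage and the grading stage are separate models with separate features, joined only by the refined ROI that one produces and the other consumes.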
format Online
Article
Text
id pubmed-6346503
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-63465032019-01-29 Two-stage CNNs for computerized BI-RADS categorization in breast ultrasound images Huang, Yunzhi Han, Luyi Dou, Haoran Luo, Honghao Yuan, Zhen Liu, Qi Zhang, Jiang Yin, Guangfu Biomed Eng Online Research BioMed Central 2019-01-24 /pmc/articles/PMC6346503/ /pubmed/30678680 http://dx.doi.org/10.1186/s12938-019-0626-5 Text en © The Author(s) 2019 Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
title Two-stage CNNs for computerized BI-RADS categorization in breast ultrasound images
topic Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6346503/
https://www.ncbi.nlm.nih.gov/pubmed/30678680
http://dx.doi.org/10.1186/s12938-019-0626-5
work_keys_str_mv AT huangyunzhi twostagecnnsforcomputerizedbiradscategorizationinbreastultrasoundimages
AT hanluyi twostagecnnsforcomputerizedbiradscategorizationinbreastultrasoundimages
AT douhaoran twostagecnnsforcomputerizedbiradscategorizationinbreastultrasoundimages
AT luohonghao twostagecnnsforcomputerizedbiradscategorizationinbreastultrasoundimages
AT yuanzhen twostagecnnsforcomputerizedbiradscategorizationinbreastultrasoundimages
AT liuqi twostagecnnsforcomputerizedbiradscategorizationinbreastultrasoundimages
AT zhangjiang twostagecnnsforcomputerizedbiradscategorizationinbreastultrasoundimages
AT yinguangfu twostagecnnsforcomputerizedbiradscategorizationinbreastultrasoundimages