Automatic model for cervical cancer screening based on convolutional neural network: a retrospective, multicohort, multicenter study
Main authors: | Tan, Xiangyu; Li, Kexin; Zhang, Jiucheng; Wang, Wenzhe; Wu, Bian; Wu, Jian; Li, Xiaoping; Huang, Xiaoyuan |
Format: | Online Article Text |
Language: | English |
Published: | BioMed Central, 2021 |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7791865/ https://www.ncbi.nlm.nih.gov/pubmed/33413391 http://dx.doi.org/10.1186/s12935-020-01742-6 |
_version_ | 1783633683687669760 |
author | Tan, Xiangyu; Li, Kexin; Zhang, Jiucheng; Wang, Wenzhe; Wu, Bian; Wu, Jian; Li, Xiaoping; Huang, Xiaoyuan |
author_facet | Tan, Xiangyu; Li, Kexin; Zhang, Jiucheng; Wang, Wenzhe; Wu, Bian; Wu, Jian; Li, Xiaoping; Huang, Xiaoyuan |
author_sort | Tan, Xiangyu |
collection | PubMed |
description | BACKGROUND: The incidence rates of cervical cancer in developing countries have been steeply increasing while the medical resources for prevention, detection, and treatment are still quite limited. Computer-based deep learning methods can achieve high-accuracy, fast cancer screening. Such methods can lead to early diagnosis, effective treatment, and hopefully successful prevention of cervical cancer. In this work, we seek to construct a robust deep convolutional neural network (DCNN) model that can assist pathologists in screening cervical cancer. METHODS: ThinPrep cytologic test (TCT) images diagnosed by pathologists from many collaborating hospitals in different regions were collected. The images were divided into a training dataset (13,775 images), validation dataset (2301 images), and test dataset (408,030 images from 290 scanned copies) for training and performance evaluation of a faster region-based convolutional neural network (Faster R-CNN) system. RESULTS: The sensitivity and specificity of the proposed cervical cancer screening system were 99.4% and 34.8%, respectively, with an area under the curve (AUC) of 0.67. The model could also distinguish between negative and positive cells. The sensitivity values for atypical squamous cells of undetermined significance (ASCUS), low-grade squamous intraepithelial lesions (LSIL), and high-grade squamous intraepithelial lesions (HSIL) were 89.3%, 71.5%, and 73.9%, respectively. This system could quickly classify the images and generate a test report in about 3 minutes. Hence, the system can reduce the burden on pathologists and save them valuable time to analyze more complex cases. CONCLUSIONS: In our study, a CNN-based TCT cervical-cancer screening model was established through a retrospective study of multicenter TCT images. This model shows improved speed and accuracy for cervical cancer screening, and helps overcome the shortage of medical resources required for cervical cancer screening. |
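The sensitivity and specificity figures reported in the abstract follow the standard binary confusion-matrix definitions. A minimal Python sketch of those definitions (the counts below are hypothetical placeholders chosen to reproduce the reported percentages, not the study's actual data):

```python
# Standard screening metrics from a binary confusion matrix.
# This is an illustrative sketch, not the authors' code.

def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: fraction of positive cases flagged by the screen."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: fraction of negative cases correctly cleared."""
    return tn / (tn + fp)

# Hypothetical counts for illustration only.
tp, fn = 994, 6      # positives caught vs. missed
tn, fp = 348, 652    # negatives cleared vs. falsely flagged

print(f"sensitivity = {sensitivity(tp, fn):.1%}")  # 99.4%
print(f"specificity = {specificity(tn, fp):.1%}")  # 34.8%
```

A very high sensitivity paired with low specificity, as reported here, is a common design point for a screening triage tool: it rarely misses a positive slide, at the cost of many negatives being flagged for pathologist review.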
format | Online Article Text |
id | pubmed-7791865 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-7791865 2021-01-11 Automatic model for cervical cancer screening based on convolutional neural network: a retrospective, multicohort, multicenter study Tan, Xiangyu; Li, Kexin; Zhang, Jiucheng; Wang, Wenzhe; Wu, Bian; Wu, Jian; Li, Xiaoping; Huang, Xiaoyuan Cancer Cell Int Primary Research BioMed Central 2021-01-07 /pmc/articles/PMC7791865/ /pubmed/33413391 http://dx.doi.org/10.1186/s12935-020-01742-6 Text en © The Author(s) 2021. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/); the Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data. |
spellingShingle | Primary Research Tan, Xiangyu; Li, Kexin; Zhang, Jiucheng; Wang, Wenzhe; Wu, Bian; Wu, Jian; Li, Xiaoping; Huang, Xiaoyuan Automatic model for cervical cancer screening based on convolutional neural network: a retrospective, multicohort, multicenter study |
title | Automatic model for cervical cancer screening based on convolutional neural network: a retrospective, multicohort, multicenter study |
title_full | Automatic model for cervical cancer screening based on convolutional neural network: a retrospective, multicohort, multicenter study |
title_fullStr | Automatic model for cervical cancer screening based on convolutional neural network: a retrospective, multicohort, multicenter study |
title_full_unstemmed | Automatic model for cervical cancer screening based on convolutional neural network: a retrospective, multicohort, multicenter study |
title_short | Automatic model for cervical cancer screening based on convolutional neural network: a retrospective, multicohort, multicenter study |
title_sort | automatic model for cervical cancer screening based on convolutional neural network: a retrospective, multicohort, multicenter study |
topic | Primary Research |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7791865/ https://www.ncbi.nlm.nih.gov/pubmed/33413391 http://dx.doi.org/10.1186/s12935-020-01742-6 |
work_keys_str_mv | AT tanxiangyu automaticmodelforcervicalcancerscreeningbasedonconvolutionalneuralnetworkaretrospectivemulticohortmulticenterstudy AT likexin automaticmodelforcervicalcancerscreeningbasedonconvolutionalneuralnetworkaretrospectivemulticohortmulticenterstudy AT zhangjiucheng automaticmodelforcervicalcancerscreeningbasedonconvolutionalneuralnetworkaretrospectivemulticohortmulticenterstudy AT wangwenzhe automaticmodelforcervicalcancerscreeningbasedonconvolutionalneuralnetworkaretrospectivemulticohortmulticenterstudy AT wubian automaticmodelforcervicalcancerscreeningbasedonconvolutionalneuralnetworkaretrospectivemulticohortmulticenterstudy AT wujian automaticmodelforcervicalcancerscreeningbasedonconvolutionalneuralnetworkaretrospectivemulticohortmulticenterstudy AT lixiaoping automaticmodelforcervicalcancerscreeningbasedonconvolutionalneuralnetworkaretrospectivemulticohortmulticenterstudy AT huangxiaoyuan automaticmodelforcervicalcancerscreeningbasedonconvolutionalneuralnetworkaretrospectivemulticohortmulticenterstudy |