
Computer-aided diagnosis of chest X-ray for COVID-19 diagnosis in external validation study by radiologists with and without deep learning system

To evaluate the diagnostic performance of our deep learning (DL) model for COVID-19 and to investigate whether the diagnostic performance of radiologists was improved by referring to our model. Our datasets contained chest X-rays (CXRs) for the following three categories: normal (NORMAL), non-COVID-19 pneumonia (PNEUMONIA), and COVID-19 pneumonia (COVID). We used two public datasets and a private dataset collected from eight hospitals for the development and external validation of our DL model (26,393 CXRs). Eight radiologists performed two reading sessions: one with reference to CXRs only, and the other with reference to both CXRs and the results of the DL model. The evaluation metrics for the reading sessions were accuracy, sensitivity, specificity, and area under the curve (AUC). The accuracy of our DL model was 0.733, and that of the eight radiologists without DL was 0.696 ± 0.031. There was a significant difference in AUC between the radiologists with and without DL for COVID versus NORMAL or PNEUMONIA (p = 0.0038). Our DL model alone showed better diagnostic performance than that of most radiologists. In addition, our model significantly improved the diagnostic performance of radiologists for COVID versus NORMAL or PNEUMONIA.


Bibliographic Details
Main Authors: Miyazaki, Aki, Ikejima, Kengo, Nishio, Mizuho, Yabuta, Minoru, Matsuo, Hidetoshi, Onoue, Koji, Matsunaga, Takaaki, Nishioka, Eiko, Kono, Atsushi, Yamada, Daisuke, Oba, Ken, Ishikura, Reiichi, Murakami, Takamichi
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10579343/
https://www.ncbi.nlm.nih.gov/pubmed/37845348
http://dx.doi.org/10.1038/s41598-023-44818-9
_version_ 1785121705066233856
author Miyazaki, Aki
Ikejima, Kengo
Nishio, Mizuho
Yabuta, Minoru
Matsuo, Hidetoshi
Onoue, Koji
Matsunaga, Takaaki
Nishioka, Eiko
Kono, Atsushi
Yamada, Daisuke
Oba, Ken
Ishikura, Reiichi
Murakami, Takamichi
author_facet Miyazaki, Aki
Ikejima, Kengo
Nishio, Mizuho
Yabuta, Minoru
Matsuo, Hidetoshi
Onoue, Koji
Matsunaga, Takaaki
Nishioka, Eiko
Kono, Atsushi
Yamada, Daisuke
Oba, Ken
Ishikura, Reiichi
Murakami, Takamichi
author_sort Miyazaki, Aki
collection PubMed
description To evaluate the diagnostic performance of our deep learning (DL) model for COVID-19 and to investigate whether the diagnostic performance of radiologists was improved by referring to our model. Our datasets contained chest X-rays (CXRs) for the following three categories: normal (NORMAL), non-COVID-19 pneumonia (PNEUMONIA), and COVID-19 pneumonia (COVID). We used two public datasets and a private dataset collected from eight hospitals for the development and external validation of our DL model (26,393 CXRs). Eight radiologists performed two reading sessions: one with reference to CXRs only, and the other with reference to both CXRs and the results of the DL model. The evaluation metrics for the reading sessions were accuracy, sensitivity, specificity, and area under the curve (AUC). The accuracy of our DL model was 0.733, and that of the eight radiologists without DL was 0.696 ± 0.031. There was a significant difference in AUC between the radiologists with and without DL for COVID versus NORMAL or PNEUMONIA (p = 0.0038). Our DL model alone showed better diagnostic performance than that of most radiologists. In addition, our model significantly improved the diagnostic performance of radiologists for COVID versus NORMAL or PNEUMONIA.
format Online
Article
Text
id pubmed-10579343
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-105793432023-10-18 Computer-aided diagnosis of chest X-ray for COVID-19 diagnosis in external validation study by radiologists with and without deep learning system Miyazaki, Aki Ikejima, Kengo Nishio, Mizuho Yabuta, Minoru Matsuo, Hidetoshi Onoue, Koji Matsunaga, Takaaki Nishioka, Eiko Kono, Atsushi Yamada, Daisuke Oba, Ken Ishikura, Reiichi Murakami, Takamichi Sci Rep Article To evaluate the diagnostic performance of our deep learning (DL) model of COVID-19 and investigate whether the diagnostic performance of radiologists was improved by referring to our model. Our datasets contained chest X-rays (CXRs) for the following three categories: normal (NORMAL), non-COVID-19 pneumonia (PNEUMONIA), and COVID-19 pneumonia (COVID). We used two public datasets and private dataset collected from eight hospitals for the development and external validation of our DL model (26,393 CXRs). Eight radiologists performed two reading sessions: one session was performed with reference to CXRs only, and the other was performed with reference to both CXRs and the results of the DL model. The evaluation metrics for the reading session were accuracy, sensitivity, specificity, and area under the curve (AUC). The accuracy of our DL model was 0.733, and that of the eight radiologists without DL was 0.696 ± 0.031. There was a significant difference in AUC between the radiologists with and without DL for COVID versus NORMAL or PNEUMONIA (p = 0.0038). Our DL model alone showed better diagnostic performance than that of most radiologists. In addition, our model significantly improved the diagnostic performance of radiologists for COVID versus NORMAL or PNEUMONIA. Nature Publishing Group UK 2023-10-16 /pmc/articles/PMC10579343/ /pubmed/37845348 http://dx.doi.org/10.1038/s41598-023-44818-9 Text en © The Author(s) 2023 https://creativecommons.org/licenses/by/4.0/Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) .
spellingShingle Article
Miyazaki, Aki
Ikejima, Kengo
Nishio, Mizuho
Yabuta, Minoru
Matsuo, Hidetoshi
Onoue, Koji
Matsunaga, Takaaki
Nishioka, Eiko
Kono, Atsushi
Yamada, Daisuke
Oba, Ken
Ishikura, Reiichi
Murakami, Takamichi
Computer-aided diagnosis of chest X-ray for COVID-19 diagnosis in external validation study by radiologists with and without deep learning system
title Computer-aided diagnosis of chest X-ray for COVID-19 diagnosis in external validation study by radiologists with and without deep learning system
title_full Computer-aided diagnosis of chest X-ray for COVID-19 diagnosis in external validation study by radiologists with and without deep learning system
title_fullStr Computer-aided diagnosis of chest X-ray for COVID-19 diagnosis in external validation study by radiologists with and without deep learning system
title_full_unstemmed Computer-aided diagnosis of chest X-ray for COVID-19 diagnosis in external validation study by radiologists with and without deep learning system
title_short Computer-aided diagnosis of chest X-ray for COVID-19 diagnosis in external validation study by radiologists with and without deep learning system
title_sort computer-aided diagnosis of chest x-ray for covid-19 diagnosis in external validation study by radiologists with and without deep learning system
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10579343/
https://www.ncbi.nlm.nih.gov/pubmed/37845348
http://dx.doi.org/10.1038/s41598-023-44818-9
work_keys_str_mv AT miyazakiaki computeraideddiagnosisofchestxrayforcovid19diagnosisinexternalvalidationstudybyradiologistswithandwithoutdeeplearningsystem
AT ikejimakengo computeraideddiagnosisofchestxrayforcovid19diagnosisinexternalvalidationstudybyradiologistswithandwithoutdeeplearningsystem
AT nishiomizuho computeraideddiagnosisofchestxrayforcovid19diagnosisinexternalvalidationstudybyradiologistswithandwithoutdeeplearningsystem
AT yabutaminoru computeraideddiagnosisofchestxrayforcovid19diagnosisinexternalvalidationstudybyradiologistswithandwithoutdeeplearningsystem
AT matsuohidetoshi computeraideddiagnosisofchestxrayforcovid19diagnosisinexternalvalidationstudybyradiologistswithandwithoutdeeplearningsystem
AT onouekoji computeraideddiagnosisofchestxrayforcovid19diagnosisinexternalvalidationstudybyradiologistswithandwithoutdeeplearningsystem
AT matsunagatakaaki computeraideddiagnosisofchestxrayforcovid19diagnosisinexternalvalidationstudybyradiologistswithandwithoutdeeplearningsystem
AT nishiokaeiko computeraideddiagnosisofchestxrayforcovid19diagnosisinexternalvalidationstudybyradiologistswithandwithoutdeeplearningsystem
AT konoatsushi computeraideddiagnosisofchestxrayforcovid19diagnosisinexternalvalidationstudybyradiologistswithandwithoutdeeplearningsystem
AT yamadadaisuke computeraideddiagnosisofchestxrayforcovid19diagnosisinexternalvalidationstudybyradiologistswithandwithoutdeeplearningsystem
AT obaken computeraideddiagnosisofchestxrayforcovid19diagnosisinexternalvalidationstudybyradiologistswithandwithoutdeeplearningsystem
AT ishikurareiichi computeraideddiagnosisofchestxrayforcovid19diagnosisinexternalvalidationstudybyradiologistswithandwithoutdeeplearningsystem
AT murakamitakamichi computeraideddiagnosisofchestxrayforcovid19diagnosisinexternalvalidationstudybyradiologistswithandwithoutdeeplearningsystem