Discovering Unknown Diseases with Explainable Automated Medical Imaging
| Main author: | Tang, Claire |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | 2020 |
| Subjects: | Article |
| Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7340943/ http://dx.doi.org/10.1007/978-3-030-52791-4_27 |
_version_ | 1783555128470536192 |
author | Tang, Claire |
author_facet | Tang, Claire |
author_sort | Tang, Claire |
collection | PubMed |
description | Deep neural network (DNN) classifiers have attained remarkable performance in diagnosing known diseases when the models are trained on a large amount of data from those diseases. However, DNN classifiers trained on known diseases usually fail when confronted with new diseases such as COVID-19. In this paper, we propose a new deep learning framework and pipeline for explainable medical imaging that can classify known diseases as well as detect new/unknown diseases when the models are trained only on known-disease images. We first provide an in-depth mathematical analysis to explain the overconfidence phenomenon and present a calibrated confidence that mitigates it. Using the calibrated confidence, we design a decision engine to determine whether a medical image belongs to a known disease or a new disease. Finally, we introduce a new visual explanation that further reveals the suspected region inside each image. Using both Skin Lesion and Chest X-Ray datasets, we validate that our framework significantly improves the accuracy of new disease discovery, i.e., distinguishing COVID-19 from pneumonia without seeing any COVID-19 data during training. We also show qualitatively that our visual explanations are highly consistent with doctors’ ground truth. While our work was not designed to target COVID-19, our experimental validation on real-world COVID-19 cases/data demonstrates the general applicability of our pipeline to different diseases based on medical imaging. (See the illustrative sketch after the record fields below.) |
format | Online Article Text |
id | pubmed-7340943 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
record_format | MEDLINE/PubMed |
spelling | pubmed-7340943 2020-07-08 Discovering Unknown Diseases with Explainable Automated Medical Imaging Tang, Claire Medical Image Understanding and Analysis Article 2020-06-09 /pmc/articles/PMC7340943/ http://dx.doi.org/10.1007/978-3-030-52791-4_27 Text en © Springer Nature Switzerland AG 2020 This article is made available via the PMC Open Access Subset for unrestricted research re-use and secondary analysis in any form or by any means with acknowledgement of the original source. These permissions are granted for the duration of the World Health Organization (WHO) declaration of COVID-19 as a global pandemic. |
spellingShingle | Article Tang, Claire Discovering Unknown Diseases with Explainable Automated Medical Imaging |
title | Discovering Unknown Diseases with Explainable Automated Medical Imaging |
title_full | Discovering Unknown Diseases with Explainable Automated Medical Imaging |
title_fullStr | Discovering Unknown Diseases with Explainable Automated Medical Imaging |
title_full_unstemmed | Discovering Unknown Diseases with Explainable Automated Medical Imaging |
title_short | Discovering Unknown Diseases with Explainable Automated Medical Imaging |
title_sort | discovering unknown diseases with explainable automated medical imaging |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7340943/ http://dx.doi.org/10.1007/978-3-030-52791-4_27 |
work_keys_str_mv | AT tangclaire discoveringunknowndiseaseswithexplainableautomatedmedicalimaging |
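The abstract describes calibrating a classifier's confidence and then thresholding that calibrated confidence to decide whether an image shows a known disease or a new one. This record does not specify the paper's calibration method or decision rule, so the sketch below is a minimal illustration only: it uses temperature scaling and a max-probability threshold as stand-ins, and the temperature value, threshold, and function names are assumptions rather than the authors' implementation.

```python
# Minimal sketch (not the paper's implementation): temperature-scaled softmax
# confidence plus a threshold-based decision engine for flagging possible
# unknown diseases. The temperature and threshold values are illustrative assumptions.
import torch
import torch.nn.functional as F


def calibrated_confidence(logits: torch.Tensor, temperature: float = 2.0) -> torch.Tensor:
    """Softmax probabilities after temperature scaling (T > 1 softens overconfident logits)."""
    return F.softmax(logits / temperature, dim=-1)


def decision_engine(logits: torch.Tensor, temperature: float = 2.0, threshold: float = 0.5):
    """Return (predicted_class, is_known); an image is flagged as a possible new
    disease when its calibrated maximum class probability falls below the threshold."""
    probs = calibrated_confidence(logits, temperature)
    max_prob, pred = probs.max(dim=-1)
    return pred, max_prob >= threshold


if __name__ == "__main__":
    # Random logits stand in for a trained known-disease classifier's output.
    logits = torch.randn(4, 3)  # 4 images, 3 known disease classes
    pred, is_known = decision_engine(logits)
    for i in range(logits.size(0)):
        verdict = f"known class {pred[i].item()}" if is_known[i] else "possible unknown disease"
        print(f"image {i}: {verdict}")
```

In a sketch like this, the threshold trades sensitivity to new diseases against false alarms on known ones; in practice it would be tuned on a validation set of known-disease images.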