Leveraging uncertainty information from deep neural networks for disease detection

Deep learning (DL) has revolutionized the field of computer vision and image processing. In medical imaging, algorithmic solutions based on DL have been shown to achieve high performance on tasks that previously required medical experts. However, DL-based solutions for disease detection have been proposed without methods to quantify and control their uncertainty in a decision. In contrast, a physician knows whether she is uncertain about a case and will consult more experienced colleagues if needed. Here we evaluate drop-out based Bayesian uncertainty measures for DL in diagnosing diabetic retinopathy (DR) from fundus images and show that it captures uncertainty better than straightforward alternatives. Furthermore, we show that uncertainty informed decision referral can improve diagnostic performance. Experiments across different networks, tasks and datasets show robust generalization. Depending on network capacity and task/dataset difficulty, we surpass 85% sensitivity and 80% specificity as recommended by the NHS when referring 0−20% of the most uncertain decisions for further inspection. We analyse causes of uncertainty by relating intuitions from 2D visualizations to the high-dimensional image space. While uncertainty is sensitive to clinically relevant cases, sensitivity to unfamiliar data samples is task dependent, but can be rendered more robust.

| Main Authors: | Leibig, Christian; Allken, Vaneeda; Ayhan, Murat Seçkin; Berens, Philipp; Wahl, Siegfried |
|---|---|
| Format: | Online Article Text |
| Language: | English |
| Published: | Nature Publishing Group UK, 2017 |
| Subjects: | Article |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5736701/ https://www.ncbi.nlm.nih.gov/pubmed/29259224 http://dx.doi.org/10.1038/s41598-017-17876-z |
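The abstract describes two technical ingredients: dropout-based (Monte Carlo dropout) Bayesian uncertainty estimates for a DL classifier, and referring the most uncertain fraction of cases for further inspection. The sketch below is only an illustration of that general idea, not the authors' implementation; the model, variable names (`model`, `images`), the 0.5 decision threshold, and the use of the predictive standard deviation as the uncertainty measure are assumptions made for the example.

```python
# Illustrative sketch of Monte Carlo dropout uncertainty and
# uncertainty-informed decision referral (not the published code).
# Assumes a binary classifier `model` containing dropout layers and a
# batch of preprocessed fundus images `images` (hypothetical names).
import torch


def mc_dropout_predict(model, images, n_samples=100):
    """Run several stochastic forward passes with dropout kept active."""
    model.train()  # keep dropout stochastic at test time
    with torch.no_grad():
        probs = torch.stack(
            [torch.sigmoid(model(images)).squeeze(-1) for _ in range(n_samples)]
        )  # shape: (n_samples, batch)
    mean_prob = probs.mean(dim=0)   # predictive mean per image
    uncertainty = probs.std(dim=0)  # predictive spread as an uncertainty score
    return mean_prob, uncertainty


def refer_most_uncertain(mean_prob, uncertainty, referral_rate=0.2):
    """Flag the most uncertain fraction of cases for expert review."""
    n_refer = int(referral_rate * uncertainty.numel())
    referred = torch.zeros_like(uncertainty, dtype=torch.bool)
    if n_refer > 0:
        referred[uncertainty.topk(n_refer).indices] = True
    decisions = mean_prob >= 0.5  # automated decision on the remaining cases
    return decisions, referred
```

Under this kind of setup, the automated decisions on the non-referred cases can be scored at different referral rates, which is how the abstract frames the reported sensitivity/specificity results relative to the NHS targets.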
_version_ | 1783287410620104704 |
---|---|
author | Leibig, Christian; Allken, Vaneeda; Ayhan, Murat Seçkin; Berens, Philipp; Wahl, Siegfried |
author_facet | Leibig, Christian; Allken, Vaneeda; Ayhan, Murat Seçkin; Berens, Philipp; Wahl, Siegfried |
author_sort | Leibig, Christian |
collection | PubMed |
description | Deep learning (DL) has revolutionized the field of computer vision and image processing. In medical imaging, algorithmic solutions based on DL have been shown to achieve high performance on tasks that previously required medical experts. However, DL-based solutions for disease detection have been proposed without methods to quantify and control their uncertainty in a decision. In contrast, a physician knows whether she is uncertain about a case and will consult more experienced colleagues if needed. Here we evaluate drop-out based Bayesian uncertainty measures for DL in diagnosing diabetic retinopathy (DR) from fundus images and show that it captures uncertainty better than straightforward alternatives. Furthermore, we show that uncertainty informed decision referral can improve diagnostic performance. Experiments across different networks, tasks and datasets show robust generalization. Depending on network capacity and task/dataset difficulty, we surpass 85% sensitivity and 80% specificity as recommended by the NHS when referring 0−20% of the most uncertain decisions for further inspection. We analyse causes of uncertainty by relating intuitions from 2D visualizations to the high-dimensional image space. While uncertainty is sensitive to clinically relevant cases, sensitivity to unfamiliar data samples is task dependent, but can be rendered more robust. |
format | Online Article Text |
id | pubmed-5736701 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2017 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-57367012017-12-21 Leveraging uncertainty information from deep neural networks for disease detection Leibig, Christian Allken, Vaneeda Ayhan, Murat Seçkin Berens, Philipp Wahl, Siegfried Sci Rep Article Deep learning (DL) has revolutionized the field of computer vision and image processing. In medical imaging, algorithmic solutions based on DL have been shown to achieve high performance on tasks that previously required medical experts. However, DL-based solutions for disease detection have been proposed without methods to quantify and control their uncertainty in a decision. In contrast, a physician knows whether she is uncertain about a case and will consult more experienced colleagues if needed. Here we evaluate drop-out based Bayesian uncertainty measures for DL in diagnosing diabetic retinopathy (DR) from fundus images and show that it captures uncertainty better than straightforward alternatives. Furthermore, we show that uncertainty informed decision referral can improve diagnostic performance. Experiments across different networks, tasks and datasets show robust generalization. Depending on network capacity and task/dataset difficulty, we surpass 85% sensitivity and 80% specificity as recommended by the NHS when referring 0−20% of the most uncertain decisions for further inspection. We analyse causes of uncertainty by relating intuitions from 2D visualizations to the high-dimensional image space. While uncertainty is sensitive to clinically relevant cases, sensitivity to unfamiliar data samples is task dependent, but can be rendered more robust. Nature Publishing Group UK 2017-12-19 /pmc/articles/PMC5736701/ /pubmed/29259224 http://dx.doi.org/10.1038/s41598-017-17876-z Text en © The Author(s) 2017 Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/. |
spellingShingle | Article; Leibig, Christian; Allken, Vaneeda; Ayhan, Murat Seçkin; Berens, Philipp; Wahl, Siegfried; Leveraging uncertainty information from deep neural networks for disease detection |
title | Leveraging uncertainty information from deep neural networks for disease detection |
title_full | Leveraging uncertainty information from deep neural networks for disease detection |
title_fullStr | Leveraging uncertainty information from deep neural networks for disease detection |
title_full_unstemmed | Leveraging uncertainty information from deep neural networks for disease detection |
title_short | Leveraging uncertainty information from deep neural networks for disease detection |
title_sort | leveraging uncertainty information from deep neural networks for disease detection |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5736701/ https://www.ncbi.nlm.nih.gov/pubmed/29259224 http://dx.doi.org/10.1038/s41598-017-17876-z |
work_keys_str_mv | AT leibigchristian leveraginguncertaintyinformationfromdeepneuralnetworksfordiseasedetection AT allkenvaneeda leveraginguncertaintyinformationfromdeepneuralnetworksfordiseasedetection AT ayhanmuratseckin leveraginguncertaintyinformationfromdeepneuralnetworksfordiseasedetection AT berensphilipp leveraginguncertaintyinformationfromdeepneuralnetworksfordiseasedetection AT wahlsiegfried leveraginguncertaintyinformationfromdeepneuralnetworksfordiseasedetection |