Leveraging human expert image annotations to improve pneumonia differentiation through human knowledge distillation
Main Authors: | Schaudt, Daniel; von Schwerin, Reinhold; Hafner, Alexander; Riedel, Pascal; Späte, Christian; Reichert, Manfred; Hinteregger, Andreas; Beer, Meinrad; Kloth, Christopher |
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10243236/ https://www.ncbi.nlm.nih.gov/pubmed/37280219 http://dx.doi.org/10.1038/s41598-023-36148-7 |
_version_ | 1785054384913121280 |
author | Schaudt, Daniel; von Schwerin, Reinhold; Hafner, Alexander; Riedel, Pascal; Späte, Christian; Reichert, Manfred; Hinteregger, Andreas; Beer, Meinrad; Kloth, Christopher |
author_facet | Schaudt, Daniel; von Schwerin, Reinhold; Hafner, Alexander; Riedel, Pascal; Späte, Christian; Reichert, Manfred; Hinteregger, Andreas; Beer, Meinrad; Kloth, Christopher |
author_sort | Schaudt, Daniel |
collection | PubMed |
description | In medical imaging, deep learning models can be a critical tool to shorten time-to-diagnosis and support specialized medical staff in clinical decision making. The successful training of deep learning models usually requires large amounts of quality data, which are often not available for many medical imaging tasks. In this work, we train a deep learning model on university hospital chest X-ray data comprising 1082 images. The data was reviewed, differentiated into four causes of pneumonia, and annotated by an expert radiologist. To successfully train a model on this small amount of complex image data, we propose a special knowledge distillation process, which we call Human Knowledge Distillation. This process enables deep learning models to utilize annotated regions in the images during the training process. This form of guidance by a human expert improves model convergence and performance. We evaluate the proposed process on our study data for multiple types of models, all of which show improved results. The best model of this study, called PneuKnowNet, shows an improvement of +2.3 percentage points in overall accuracy compared to a baseline model and also leads to more meaningful decision regions. Utilizing this implicit data quality-quantity trade-off can be a promising approach for many scarce-data domains beyond medical imaging. |
format | Online Article Text |
id | pubmed-10243236 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-102432362023-06-07 Leveraging human expert image annotations to improve pneumonia differentiation through human knowledge distillation Schaudt, Daniel; von Schwerin, Reinhold; Hafner, Alexander; Riedel, Pascal; Späte, Christian; Reichert, Manfred; Hinteregger, Andreas; Beer, Meinrad; Kloth, Christopher Sci Rep Article In medical imaging, deep learning models can be a critical tool to shorten time-to-diagnosis and support specialized medical staff in clinical decision making. The successful training of deep learning models usually requires large amounts of quality data, which are often not available for many medical imaging tasks. In this work, we train a deep learning model on university hospital chest X-ray data comprising 1082 images. The data was reviewed, differentiated into four causes of pneumonia, and annotated by an expert radiologist. To successfully train a model on this small amount of complex image data, we propose a special knowledge distillation process, which we call Human Knowledge Distillation. This process enables deep learning models to utilize annotated regions in the images during the training process. This form of guidance by a human expert improves model convergence and performance. We evaluate the proposed process on our study data for multiple types of models, all of which show improved results. The best model of this study, called PneuKnowNet, shows an improvement of +2.3 percentage points in overall accuracy compared to a baseline model and also leads to more meaningful decision regions. Utilizing this implicit data quality-quantity trade-off can be a promising approach for many scarce-data domains beyond medical imaging. |
Nature Publishing Group UK 2023-06-06 /pmc/articles/PMC10243236/ /pubmed/37280219 http://dx.doi.org/10.1038/s41598-023-36148-7 Text en © The Author(s) 2023 https://creativecommons.org/licenses/by/4.0/ Open Access: This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/) . |
spellingShingle | Article Schaudt, Daniel von Schwerin, Reinhold Hafner, Alexander Riedel, Pascal Späte, Christian Reichert, Manfred Hinteregger, Andreas Beer, Meinrad Kloth, Christopher Leveraging human expert image annotations to improve pneumonia differentiation through human knowledge distillation |
title | Leveraging human expert image annotations to improve pneumonia differentiation through human knowledge distillation |
title_full | Leveraging human expert image annotations to improve pneumonia differentiation through human knowledge distillation |
title_fullStr | Leveraging human expert image annotations to improve pneumonia differentiation through human knowledge distillation |
title_full_unstemmed | Leveraging human expert image annotations to improve pneumonia differentiation through human knowledge distillation |
title_short | Leveraging human expert image annotations to improve pneumonia differentiation through human knowledge distillation |
title_sort | leveraging human expert image annotations to improve pneumonia differentiation through human knowledge distillation |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10243236/ https://www.ncbi.nlm.nih.gov/pubmed/37280219 http://dx.doi.org/10.1038/s41598-023-36148-7 |
work_keys_str_mv | AT schaudtdaniel leveraginghumanexpertimageannotationstoimprovepneumoniadifferentiationthroughhumanknowledgedistillation AT vonschwerinreinhold leveraginghumanexpertimageannotationstoimprovepneumoniadifferentiationthroughhumanknowledgedistillation AT hafneralexander leveraginghumanexpertimageannotationstoimprovepneumoniadifferentiationthroughhumanknowledgedistillation AT riedelpascal leveraginghumanexpertimageannotationstoimprovepneumoniadifferentiationthroughhumanknowledgedistillation AT spatechristian leveraginghumanexpertimageannotationstoimprovepneumoniadifferentiationthroughhumanknowledgedistillation AT reichertmanfred leveraginghumanexpertimageannotationstoimprovepneumoniadifferentiationthroughhumanknowledgedistillation AT hintereggerandreas leveraginghumanexpertimageannotationstoimprovepneumoniadifferentiationthroughhumanknowledgedistillation AT beermeinrad leveraginghumanexpertimageannotationstoimprovepneumoniadifferentiationthroughhumanknowledgedistillation AT klothchristopher leveraginghumanexpertimageannotationstoimprovepneumoniadifferentiationthroughhumanknowledgedistillation |
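The abstract in this record describes Human Knowledge Distillation as letting a model use expert-annotated image regions during training. The article's actual loss formulation is not reproduced in this record, so the following is only an illustrative sketch of one way such guidance could work: a combined objective in which a hypothetical region-guidance term penalizes activation mass that falls outside the radiologist's annotation mask. All names (`region_guidance_loss`, `human_knowledge_distillation_loss`) and the weighting factor `alpha` are assumptions for illustration, not taken from the paper.

```python
import numpy as np

def classification_loss(probs, label):
    # Standard cross-entropy for the predicted pneumonia-cause distribution.
    # probs: 1-D array of class probabilities; label: index of the true class.
    return -np.log(probs[label] + 1e-12)

def region_guidance_loss(activation_map, annotation_mask):
    # Hypothetical guidance term (assumption, not from the article):
    # fraction of the model's activation mass lying OUTSIDE the
    # expert-annotated region. 0 when all activation is inside the mask.
    outside = activation_map * (1.0 - annotation_mask)
    return outside.sum() / (activation_map.sum() + 1e-12)

def human_knowledge_distillation_loss(probs, label, activation_map,
                                      annotation_mask, alpha=0.5):
    # Combined objective: classification loss plus alpha-weighted
    # region guidance. alpha is an illustrative hyperparameter.
    return classification_loss(probs, label) + \
        alpha * region_guidance_loss(activation_map, annotation_mask)

# Usage: a toy 2x2 "activation map" fully inside the annotated region
# contributes no guidance penalty; activation outside the mask does.
probs = np.array([0.1, 0.7, 0.1, 0.1])
mask = np.array([[1.0, 0.0], [0.0, 0.0]])
inside = np.array([[1.0, 0.0], [0.0, 0.0]])
outside = np.array([[0.0, 1.0], [0.0, 0.0]])
loss_in = human_knowledge_distillation_loss(probs, 1, inside, mask)
loss_out = human_knowledge_distillation_loss(probs, 1, outside, mask)
```

Under this sketch, `loss_out > loss_in` for the same prediction, so gradient descent would be nudged toward models whose salient regions overlap the expert annotations, which matches the abstract's stated intent of improving convergence and producing more meaningful decision regions.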