
Explainable artificial intelligence (XAI) in radiology and nuclear medicine: a literature review

RATIONALE: Deep learning (DL) has demonstrated remarkable performance in diagnostic imaging across various diseases and modalities and therefore has high potential to be used as a clinical tool. However, these algorithms are rarely deployed in clinical practice because their underlying black-box mechanism limits transparency and trust. For successful deployment, explainable artificial intelligence (XAI) could be introduced to close the gap between medical professionals and DL algorithms. In this literature review, XAI methods available for magnetic resonance (MR), computed tomography (CT), and positron emission tomography (PET) imaging are discussed and suggestions for future work are made. METHODS: PubMed, Embase.com and Clarivate Analytics/Web of Science Core Collection were screened. Articles were considered eligible for inclusion if XAI was used (and well described) to explain the behavior of a DL model used in MR, CT, or PET imaging. RESULTS: A total of 75 articles were included, of which 54 described post hoc XAI methods, 17 described ad hoc XAI methods, and 4 described both. Major variation in performance is seen between the methods. Overall, post hoc XAI lacks the ability to provide class-discriminative and target-specific explanations. Ad hoc XAI appears to address this through its intrinsic ability to explain. However, quality control of the XAI methods is rarely applied, so systematic comparison between the methods is difficult. CONCLUSION: There is currently no clear consensus on how XAI should be deployed to close the gap between medical professionals and DL algorithms for clinical implementation. We advocate systematic technical and clinical quality assessment of XAI methods. In addition, to ensure end-to-end unbiased and safe integration of XAI in the clinical workflow, (anatomical) data minimization and quality control methods should be included.


Bibliographic Details
Main Authors: de Vries, Bart M., Zwezerijnen, Gerben J. C., Burchell, George L., van Velden, Floris H. P., Menke-van der Houven van Oordt, Catharina Willemien, Boellaard, Ronald
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2023-05-12
Journal: Front Med (Lausanne)
Subjects: Medicine
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10213317/
https://www.ncbi.nlm.nih.gov/pubmed/37250654
http://dx.doi.org/10.3389/fmed.2023.1180773
Rights: Copyright © 2023 de Vries, Zwezerijnen, Burchell, van Velden, Menke-van der Houven van Oordt and Boellaard. Open-access article distributed under the terms of the Creative Commons Attribution License (CC BY): https://creativecommons.org/licenses/by/4.0/