Radiology artificial intelligence: a systematic review and evaluation of methods (RAISE)
OBJECTIVE: There has been a large amount of research in the field of artificial intelligence (AI) as applied to clinical radiology. However, these studies vary in design and quality, and systematic reviews of the entire field are lacking. This systematic review aimed to identify all papers that used deep learning in radiology, to survey the literature and to evaluate their methods.
Main Authors: Kelly, Brendan S.; Judge, Conor; Bollard, Stephanie M.; Clifford, Simon M.; Healy, Gerard M.; Aziz, Awsam; Mathur, Prateek; Islam, Shah; Yeom, Kristen W.; Lawlor, Aonghus; Killeen, Ronan P.
Format: Online Article Text
Language: English
Published: Springer Berlin Heidelberg, 2022
Subjects: Imaging Informatics and Artificial Intelligence
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9668941/ https://www.ncbi.nlm.nih.gov/pubmed/35420305 http://dx.doi.org/10.1007/s00330-022-08784-6
_version_ | 1784832023714594816
author | Kelly, Brendan S. Judge, Conor Bollard, Stephanie M. Clifford, Simon M. Healy, Gerard M. Aziz, Awsam Mathur, Prateek Islam, Shah Yeom, Kristen W. Lawlor, Aonghus Killeen, Ronan P. |
author_sort | Kelly, Brendan S. |
collection | PubMed |
description | OBJECTIVE: There has been a large amount of research in the field of artificial intelligence (AI) as applied to clinical radiology. However, these studies vary in design and quality, and systematic reviews of the entire field are lacking. This systematic review aimed to identify all papers that used deep learning in radiology, to survey the literature and to evaluate their methods. We aimed to identify the key questions being addressed in the literature and to identify the most effective methods employed. METHODS: We followed the PRISMA guidelines and performed a systematic review of studies of AI in radiology published from 2015 to 2019. Our published protocol was prospectively registered. RESULTS: Our search yielded 11,083 results. Seven hundred sixty-seven full texts were reviewed, and 535 articles were included. Ninety-eight percent were retrospective cohort studies. The median number of patients included was 460. Most studies involved MRI (37%). Neuroradiology was the most common subspecialty. Eighty-eight percent used supervised learning. The most common task was segmentation (39%). Performance comparison was with a state-of-the-art model in 37% of studies. The most used established architecture was UNet (14%). The median performance for the most utilised evaluation metrics was a Dice of 0.89 (range 0.49–0.99), AUC of 0.903 (range 0.61–1.00) and accuracy of 89.4 (range 70.2–100). Of the 77 studies that externally validated their results and allowed for direct comparison, performance on average decreased by 6% at external validation (range: increase of 4% to decrease of 44%). CONCLUSION: This systematic review has surveyed the major advances in AI as applied to clinical radiology. KEY POINTS: • While there are many papers reporting expert-level results by using deep learning in radiology, most apply only a narrow range of techniques to a narrow selection of use cases. • The literature is dominated by retrospective cohort studies with limited external validation and high potential for bias. • The recent advent of AI extensions to systematic reporting guidelines and prospective trial registration, along with a focus on external validation and explanations, shows potential for translating the hype surrounding AI from code to clinic. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1007/s00330-022-08784-6. |
format | Online Article Text |
id | pubmed-9668941 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | Springer Berlin Heidelberg |
record_format | MEDLINE/PubMed |
spelling | pubmed-9668941 2022-11-18 Radiology artificial intelligence: a systematic review and evaluation of methods (RAISE). Kelly, Brendan S.; Judge, Conor; Bollard, Stephanie M.; Clifford, Simon M.; Healy, Gerard M.; Aziz, Awsam; Mathur, Prateek; Islam, Shah; Yeom, Kristen W.; Lawlor, Aonghus; Killeen, Ronan P. Eur Radiol, Imaging Informatics and Artificial Intelligence. Springer Berlin Heidelberg, 2022-04-14, 2022. /pmc/articles/PMC9668941/ /pubmed/35420305 http://dx.doi.org/10.1007/s00330-022-08784-6 Text en © The Author(s) 2022, corrected publication 2022. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/). |
title | Radiology artificial intelligence: a systematic review and evaluation of methods (RAISE) |
title_short | Radiology artificial intelligence: a systematic review and evaluation of methods (RAISE) |
title_sort | radiology artificial intelligence: a systematic review and evaluation of methods (raise) |
topic | Imaging Informatics and Artificial Intelligence |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9668941/ https://www.ncbi.nlm.nih.gov/pubmed/35420305 http://dx.doi.org/10.1007/s00330-022-08784-6 |