Artificial Intelligence to Improve Patient Understanding of Radiology Reports
Diagnostic imaging reports are generally written with a target audience of other providers. As a result, the reports are written with medical jargon and technical detail to ensure accurate communication. With implementation of the 21st Century Cures Act, patients have greater and quicker access to their imaging reports, but these reports are still written above the comprehension level of the average patient. Consequently, many patients have requested reports to be conveyed in language accessible to them. Numerous studies have shown that improving patient understanding of their condition results in better outcomes, so driving comprehension of imaging reports is essential. Summary statements, second reports, and the inclusion of the radiologist’s phone number have been proposed, but these solutions have implications for radiologist workflow. Artificial intelligence (AI) has the potential to simplify imaging reports without significant disruptions. Many AI technologies have been applied to radiology reports in the past for various clinical and research purposes, but patient-focused solutions have largely been ignored. New natural language processing technologies and large language models (LLMs) have the potential to improve patient understanding of their imaging reports. However, LLMs are a nascent technology and significant research is required before LLM-driven report simplification is used in patient care.
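The abstract's central idea, LLM-driven simplification of report language, is not tied to any specific implementation in the article. As a purely illustrative sketch of that idea, the snippet below asks a general-purpose LLM to rewrite a sample impression section in plain language; the OpenAI Python SDK, the model name, the prompt wording, and the sample report text are all assumptions of this sketch, not details from the article.

```python
# Illustrative sketch only -- the article does not prescribe an implementation.
# Assumes the OpenAI Python SDK (`pip install openai`) and an API key in the
# OPENAI_API_KEY environment variable; model, prompt, and report are hypothetical.
from openai import OpenAI

client = OpenAI()

# A made-up impression section, standing in for a real radiology report.
report_impression = (
    "1. No acute intracranial hemorrhage or mass effect. "
    "2. Chronic microvascular ischemic changes, unchanged from prior."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: any general-purpose chat model
    messages=[
        {
            "role": "system",
            "content": (
                "Rewrite radiology report findings in plain language at roughly "
                "an 8th-grade reading level. Do not add, remove, or change any "
                "medical findings, and advise the reader to discuss the results "
                "with their physician."
            ),
        },
        {"role": "user", "content": report_impression},
    ],
)

print(response.choices[0].message.content)
```

Any such output would still need validation for accuracy and safety before use in patient care, which is the open research question the article highlights.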
Main authors: | Amin, Kanhai; Khosla, Pavan; Doshi, Rushabh; Chheang, Sophie; Forman, Howard P. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | YJBM, 2023 |
Subjects: | Mini-Review |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10524809/ https://www.ncbi.nlm.nih.gov/pubmed/37780992 http://dx.doi.org/10.59249/NKOY5498 |
_version_ | 1785110688281133056 |
---|---|
author | Amin, Kanhai; Khosla, Pavan; Doshi, Rushabh; Chheang, Sophie; Forman, Howard P. |
author_facet | Amin, Kanhai; Khosla, Pavan; Doshi, Rushabh; Chheang, Sophie; Forman, Howard P. |
author_sort | Amin, Kanhai |
collection | PubMed |
description | Diagnostic imaging reports are generally written with a target audience of other providers. As a result, the reports are written with medical jargon and technical detail to ensure accurate communication. With implementation of the 21st Century Cures Act, patients have greater and quicker access to their imaging reports, but these reports are still written above the comprehension level of the average patient. Consequently, many patients have requested reports to be conveyed in language accessible to them. Numerous studies have shown that improving patient understanding of their condition results in better outcomes, so driving comprehension of imaging reports is essential. Summary statements, second reports, and the inclusion of the radiologist’s phone number have been proposed, but these solutions have implications for radiologist workflow. Artificial intelligence (AI) has the potential to simplify imaging reports without significant disruptions. Many AI technologies have been applied to radiology reports in the past for various clinical and research purposes, but patient focused solutions have largely been ignored. New natural language processing technologies and large language models (LLMs) have the potential to improve patient understanding of their imaging reports. However, LLMs are a nascent technology and significant research is required before LLM-driven report simplification is used in patient care. |
format | Online Article Text |
id | pubmed-10524809 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | YJBM |
record_format | MEDLINE/PubMed |
spelling | pubmed-105248092023-09-29 Artificial Intelligence to Improve Patient Understanding of Radiology Reports Amin, Kanhai Khosla, Pavan Doshi, Rushabh Chheang, Sophie Forman, Howard P. Yale J Biol Med Mini-Review Diagnostic imaging reports are generally written with a target audience of other providers. As a result, the reports are written with medical jargon and technical detail to ensure accurate communication. With implementation of the 21st Century Cures Act, patients have greater and quicker access to their imaging reports, but these reports are still written above the comprehension level of the average patient. Consequently, many patients have requested reports to be conveyed in language accessible to them. Numerous studies have shown that improving patient understanding of their condition results in better outcomes, so driving comprehension of imaging reports is essential. Summary statements, second reports, and the inclusion of the radiologist’s phone number have been proposed, but these solutions have implications for radiologist workflow. Artificial intelligence (AI) has the potential to simplify imaging reports without significant disruptions. Many AI technologies have been applied to radiology reports in the past for various clinical and research purposes, but patient focused solutions have largely been ignored. New natural language processing technologies and large language models (LLMs) have the potential to improve patient understanding of their imaging reports. However, LLMs are a nascent technology and significant research is required before LLM-driven report simplification is used in patient care. YJBM 2023-09-29 /pmc/articles/PMC10524809/ /pubmed/37780992 http://dx.doi.org/10.59249/NKOY5498 Text en Copyright ©2023, Yale Journal of Biology and Medicine https://creativecommons.org/licenses/by-nc/4.0/This is an open access article distributed under the terms of the Creative Commons CC BY-NC license, which permits use, distribution, and reproduction in any medium, provided the original work is properly cited. You may not use the material for commercial purposes. |
spellingShingle | Mini-Review Amin, Kanhai Khosla, Pavan Doshi, Rushabh Chheang, Sophie Forman, Howard P. Artificial Intelligence to Improve Patient Understanding of Radiology Reports |
title | Artificial Intelligence to Improve Patient Understanding of Radiology Reports |
title_full | Artificial Intelligence to Improve Patient Understanding of Radiology Reports |
title_fullStr | Artificial Intelligence to Improve Patient Understanding of Radiology Reports |
title_full_unstemmed | Artificial Intelligence to Improve Patient Understanding of Radiology Reports |
title_short | Artificial Intelligence to Improve Patient Understanding of Radiology Reports |
title_sort | artificial intelligence to improve patient understanding of radiology reports |
topic | Mini-Review |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10524809/ https://www.ncbi.nlm.nih.gov/pubmed/37780992 http://dx.doi.org/10.59249/NKOY5498 |
work_keys_str_mv | AT aminkanhai artificialintelligencetoimprovepatientunderstandingofradiologyreports AT khoslapavan artificialintelligencetoimprovepatientunderstandingofradiologyreports AT doshirushabh artificialintelligencetoimprovepatientunderstandingofradiologyreports AT chheangsophie artificialintelligencetoimprovepatientunderstandingofradiologyreports AT formanhowardp artificialintelligencetoimprovepatientunderstandingofradiologyreports |