Non-task expert physicians benefit from correct explainable AI advice when reviewing X-rays


Bibliographic Details
Main Authors: Gaube, Susanne, Suresh, Harini, Raue, Martina, Lermer, Eva, Koch, Timo K., Hudecek, Matthias F. C., Ackery, Alun D., Grover, Samir C., Coughlin, Joseph F., Frey, Dieter, Kitamura, Felipe C., Ghassemi, Marzyeh, Colak, Errol
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9876883/
https://www.ncbi.nlm.nih.gov/pubmed/36697450
http://dx.doi.org/10.1038/s41598-023-28633-w
Description
Summary: Artificial intelligence (AI)-generated clinical advice is becoming more prevalent in healthcare. However, the impact of AI-generated advice on physicians’ decision-making is underexplored. In this study, physicians received X-rays with correct diagnostic advice and were asked to make a diagnosis, rate the advice’s quality, and judge their own confidence. We manipulated whether the advice came with or without a visual annotation on the X-rays, and whether it was labeled as coming from an AI or a human radiologist. Overall, receiving annotated advice from an AI resulted in the highest diagnostic accuracy. Physicians rated the quality of AI advice higher than human advice. We did not find a strong effect of either manipulation on participants’ confidence. The magnitude of the effects varied between task experts and non-task experts, with the latter benefiting considerably from correct explainable AI advice. These findings raise important considerations for the deployment of diagnostic advice in healthcare.