Learning Analytics Applied to Clinical Diagnostic Reasoning Using a Natural Language Processing–Based Virtual Patient Simulator: Case Study

Bibliographic Details
Main Authors: Furlan, Raffaello; Gatti, Mauro; Mene, Roberto; Shiffer, Dana; Marchiori, Chiara; Giaj Levra, Alessandro; Saturnino, Vincenzo; Brunetta, Enrico; Dipaola, Franca
Format: Online Article Text
Language: English
Published: JMIR Publications 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8931645/
https://www.ncbi.nlm.nih.gov/pubmed/35238786
http://dx.doi.org/10.2196/24372
author Furlan, Raffaello
Gatti, Mauro
Mene, Roberto
Shiffer, Dana
Marchiori, Chiara
Giaj Levra, Alessandro
Saturnino, Vincenzo
Brunetta, Enrico
Dipaola, Franca
collection PubMed
description BACKGROUND: Virtual patient simulators (VPSs) log all users’ actions, thereby enabling the creation of a multidimensional representation of students’ medical knowledge. This representation can be used to create metrics providing teachers with valuable learning information.
OBJECTIVE: The aim of this study is to describe the metrics we developed to analyze the clinical diagnostic reasoning of medical students, provide examples of their application, and preliminarily validate these metrics on a class of undergraduate medical students. The metrics are computed from the data obtained through a novel VPS embedding natural language processing techniques.
METHODS: A total of 2 clinical case simulations (tests) were created to test our metrics. During each simulation, the students’ step-by-step actions were logged into the program database for offline analysis. The students’ performance was divided into 7 dimensions: identification of relevant information in the given clinical scenario, history taking, physical examination, medical test ordering, diagnostic hypothesis setting, binary analysis fulfillment, and final diagnosis setting. Sensitivity (the percentage of relevant information found) and precision (the percentage of correct actions performed) were computed for each dimension and combined into their harmonic mean (F1), yielding a single score for the student’s performance on that dimension. The 7 dimension scores were further grouped to reflect the students’ capability to collect and to analyze information, yielding an overall performance score. A methodological score was computed from the discordance between the diagnostic pathway followed by each student and the reference pathway previously defined by the teacher. In total, 25 students attending the fifth year of the School of Medicine at Humanitas University underwent test 1, which simulated a patient with dyspnea. Test 2 dealt with abdominal pain and was taken by 36 students on a different day. For validation, we assessed the Spearman rank correlation between these scores and the score obtained by each student in the hematology curricular examination.
RESULTS: The mean overall scores were consistent between test 1 (mean 0.59, SD 0.05) and test 2 (mean 0.54, SD 0.12). Each student achieved their overall performance through a different balance between collecting and analyzing information. Methodological scores highlighted discordances between the reference diagnostic pattern previously set by the teacher and the one pursued by the student. No significant correlation was found between the VPS scores and the hematology examination scores.
CONCLUSIONS: Different components of the students’ diagnostic process may be disentangled and quantified by applying appropriate metrics to the actions students perform while addressing a virtual case. Such an approach may help teachers provide students with individualized feedback aimed at addressing competence gaps and methodological inconsistencies. There was no correlation between the hematology curricular examination score and any of the proposed scores, as these scores address different aspects of students’ medical knowledge.
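The scoring scheme summarized in the METHODS section (per-dimension sensitivity and precision combined into their harmonic mean F1, an aggregated overall score, and a Spearman rank correlation against examination grades) can be illustrated with a minimal sketch. The code below is not the authors’ implementation: the function name dimension_f1, the action sets, and all numeric data are hypothetical, and scipy.stats.spearmanr merely stands in for whatever routine the authors used for the correlation.

```python
# Illustrative sketch only: names, action sets, and numbers are hypothetical,
# not taken from the study.
from scipy.stats import spearmanr


def dimension_f1(student_actions: set, relevant_actions: set) -> float:
    """Harmonic mean (F1) of sensitivity and precision for one dimension."""
    if not student_actions or not relevant_actions:
        return 0.0
    true_positives = len(student_actions & relevant_actions)
    sensitivity = true_positives / len(relevant_actions)  # relevant info found
    precision = true_positives / len(student_actions)     # correct actions performed
    if sensitivity + precision == 0:
        return 0.0
    return 2 * sensitivity * precision / (sensitivity + precision)


# One student's history-taking dimension (hypothetical actions).
student = {"chest pain onset", "smoking history", "family history"}
reference = {"chest pain onset", "smoking history", "dyspnea on exertion"}
history_score = dimension_f1(student, reference)  # 2 found of 3 relevant, 2 of 3 correct -> 0.666...

# Overall score as a simple mean of the 7 dimension scores; the paper further
# groups dimensions into "collecting" vs "analyzing" information.
dimension_scores = [history_score, 0.7, 0.5, 0.6, 0.4, 0.8, 0.55]
overall = sum(dimension_scores) / len(dimension_scores)

# Validation step described in the abstract: Spearman rank correlation between
# VPS overall scores and curricular examination grades (hypothetical data).
vps_scores = [0.59, 0.61, 0.48, 0.55, 0.62]
exam_grades = [27, 30, 24, 26, 28]
rho, p_value = spearmanr(vps_scores, exam_grades)
print(f"rho={rho:.2f}, p={p_value:.3f}")
```

In the same spirit, the methodological score could plausibly be realized as an edit distance between the student’s sequence of diagnostic steps and the teacher’s reference pathway, although the abstract only states that it quantifies the discordance between the two.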
format Online
Article
Text
id pubmed-8931645
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher JMIR Publications
record_format MEDLINE/PubMed
spelling pubmed-8931645 2022-03-19 Learning Analytics Applied to Clinical Diagnostic Reasoning Using a Natural Language Processing–Based Virtual Patient Simulator: Case Study. JMIR Med Educ, Original Paper.
JMIR Publications 2022-03-03 /pmc/articles/PMC8931645/ /pubmed/35238786 http://dx.doi.org/10.2196/24372 Text en ©Raffaello Furlan, Mauro Gatti, Roberto Mene, Dana Shiffer, Chiara Marchiori, Alessandro Giaj Levra, Vincenzo Saturnino, Enrico Brunetta, Franca Dipaola. Originally published in JMIR Medical Education (https://mededu.jmir.org), 03.03.2022. This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Education, is properly cited. The complete bibliographic information, a link to the original publication on https://mededu.jmir.org/, as well as this copyright and license information must be included.
title Learning Analytics Applied to Clinical Diagnostic Reasoning Using a Natural Language Processing–Based Virtual Patient Simulator: Case Study
topic Original Paper
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8931645/
https://www.ncbi.nlm.nih.gov/pubmed/35238786
http://dx.doi.org/10.2196/24372