
Efficiency, Usability, and Outcomes of Proctored Next-Level Exams for Proficiency Testing in Primary Care Education: Observational Study

Bibliographic Details
Main Authors: Schoenmakers, Birgitte; Wens, Johan
Format: Online Article Text
Language: English
Published: JMIR Publications, 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8406127/
https://www.ncbi.nlm.nih.gov/pubmed/34398786
http://dx.doi.org/10.2196/23834
Description

BACKGROUND: The COVID-19 pandemic disrupted education and assessment programs and complicated planning. We therefore organized the proficiency test for admission to the Family Medicine program as a proctored exam. To prevent fraud, we developed a web-based supervisor app for tracking and tracing candidates’ behavior.

OBJECTIVE: We aimed to assess the efficiency and usability of the proctored exam procedure and to analyze its impact on exam scores.

METHODS: The application registered events on three levels: recording of actions, analysis of behavior, and live supervision. Each suspicious event was given a score. To assess efficiency, we logged technical issues and interventions. To test usability, we counted the number of suspicious students and behaviors. To analyze the supervisor app’s impact on exam outcomes, we compared the scores of the proctored group with those of the on-campus group. Candidates were free to register for off-campus or on-campus participation.

RESULTS: Of the 593 candidates who registered for the exam, 472 (79.6%) used the supervisor app and 121 (20.4%) sat the exam on campus. The test results of both groups were comparable. We registered 15 technical issues, all off campus; 2 candidates’ exams were negatively affected by technical issues. The application flagged 22 candidates with a suspicion rating of >1. Suspicion ratings increased mainly because of background noise, and all flagged events occurred without fraudulent intent.

CONCLUSIONS: This pilot observational study demonstrated that a supervisor app that records and registers behavior was able to detect suspicious events without affecting exam outcomes. Background noise was the most critical event, and no fraud was detected. Future research should use a controlled study design to compare the cost-benefit balance of the supervisor app’s complex interventions against candidates’ awareness of being monitored via a safe browser plug-in for exams.
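The article does not publish the app’s scoring rules, only that each suspicious event received a score and that candidates with a rating of >1 were flagged. As a purely illustrative sketch of that idea (the event types, weights, and names below are assumptions, not the authors’ implementation), per-candidate event scoring might look like:

```python
from dataclasses import dataclass, field

# Hypothetical event weights: the study reports a suspicion rating with a
# flagging threshold of >1 but does not disclose how events were weighted.
EVENT_WEIGHTS = {
    "background_noise": 0.5,
    "face_not_visible": 0.75,
    "window_switch": 1.0,
}

@dataclass
class Candidate:
    name: str
    events: list = field(default_factory=list)  # registered suspicious events

    def suspicion_rating(self) -> float:
        # Sum the weight of every registered event; unknown events score 0.
        return sum(EVENT_WEIGHTS.get(e, 0.0) for e in self.events)

    def flagged(self, threshold: float = 1.0) -> bool:
        # Candidates whose rating exceeds the threshold are flagged for review.
        return self.suspicion_rating() > threshold

c = Candidate("candidate_042",
              events=["background_noise", "background_noise", "window_switch"])
print(c.suspicion_rating())  # 2.0
print(c.flagged())           # True
```

Under such a scheme, repeated low-weight events like background noise can push a candidate over the threshold without any fraudulent intent, which matches the study’s finding that noise was the main driver of suspicion ratings.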
Published in: JMIR Formative Research (JMIR Publications), Original Paper, 16.08.2021
©Birgitte Schoenmakers, Johan Wens. This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Formative Research, is properly cited. The complete bibliographic information, a link to the original publication on https://formative.jmir.org, as well as this copyright and license information must be included.