
Crowdsource authoring as a tool for enhancing the quality of competency assessments in healthcare professions


Bibliographic Details
Main Authors: Lin, Che-Wei; Clinciu, Daniel L.; Salcedo, Daniel; Huang, Chih-Wei; Kang, Enoch Yi No; Li, Yu-Chuan (Jack)
Format: Online Article Text
Language: English
Published: Public Library of Science 2023
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10621860/
https://www.ncbi.nlm.nih.gov/pubmed/37917751
http://dx.doi.org/10.1371/journal.pone.0278571
Description: The current Objective Structured Clinical Examination (OSCE) is complex and costly, and makes it difficult to provide high-quality assessments. This pilot study employed a focus group and a debugging stage to test the Crowdsource Authoring Assessment Tool (CAAT), a platform for creating and sharing assessment tools that can be edited and customized to match specific users’ needs and to produce higher-quality checklists. International experts in competency assessment (n = 50) were asked to 1) participate in and experience the CAAT system by editing their own checklist, 2) edit a urinary catheterization checklist using the CAAT, and 3) complete a 14-item Technology Acceptance Model (TAM) questionnaire evaluating its four domains. The study ran from October 2018 to May 2019. The median time to develop a new checklist was 65.76 minutes with the CAAT, whereas the traditional method required 167.90 minutes. The CAAT enabled quicker checklist creation and editing regardless of participants’ experience and native language. Participants also reported that the CAAT enhanced checklist development, and 96% were willing to recommend the tool to others. As this study shows, the crowdsource authoring tool reduced checklist development time to roughly a third of that required by the traditional method. In addition, it offers a simple collaborative platform that encourages contributions to checklist creation, editing, and rating.
Published online by Public Library of Science, 2023-11-02. © 2023 Lin et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.