
Derivation of Two Critical Appraisal Scores for Trainees to Evaluate Online Educational Resources: A METRIQ Study

INTRODUCTION: Online education resources (OERs), like blogs and podcasts, increasingly augment or replace traditional medical education resources such as textbooks and lectures. Trainees’ ability to evaluate these resources is poor, and few quality assessment aids have been developed to assist them....


Bibliographic Details
Main Authors: Chan, Teresa M., Thoma, Brent, Krishnan, Keeth, Lin, Michelle, Carpenter, Christopher R., Astin, Matt, Kulasegaram, Kulamakan
Format: Online Article Text
Language: English
Published: Department of Emergency Medicine, University of California, Irvine School of Medicine 2016
Subjects: Education
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5017842/
https://www.ncbi.nlm.nih.gov/pubmed/27625722
http://dx.doi.org/10.5811/westjem.2016.6.30825
author Chan, Teresa M.
Thoma, Brent
Krishnan, Keeth
Lin, Michelle
Carpenter, Christopher R.
Astin, Matt
Kulasegaram, Kulamakan
collection PubMed
description INTRODUCTION: Online education resources (OERs), like blogs and podcasts, increasingly augment or replace traditional medical education resources such as textbooks and lectures. Trainees’ ability to evaluate these resources is poor, and few quality assessment aids have been developed to assist them. This study aimed to derive a quality evaluation instrument for this purpose. METHODS: We used a three-phase methodology. In Phase 1, a previously derived list of 151 OER quality indicators was reduced to 13 items using data from published consensus-building studies (of medical educators, expert podcasters, and expert bloggers) and subsequent evaluation by our team. In Phase 2, these 13 items were converted to seven-point Likert scales used by trainee raters (n=40) to evaluate 39 OERs. The reliability and usability of these 13 rating items were determined using responses from trainee raters, and top items were used to create two OER quality evaluation instruments. In Phase 3, these instruments were compared to an external certification process (the ALiEM AIR certification) and the gestalt evaluation of the same 39 blog posts by 20 faculty educators. RESULTS: Two quality-evaluation instruments were derived with fair inter-rater reliability: the METRIQ-8 Score (Intraclass correlation coefficient [ICC]=0.30, p<0.001) and the METRIQ-5 Score (ICC=0.22, p<0.001). Both scores, when calculated using the derivation data, correlated with educator gestalt (Pearson’s r=0.35, p=0.03 and r=0.41, p<0.01, respectively) and were related to increased odds of receiving an ALiEM AIR certification (odds ratio=1.28, p=0.03; OR=1.5, p=0.004, respectively). CONCLUSION: Two novel scoring instruments with adequate psychometric properties were derived to assist trainees in evaluating OER quality and correlated favourably with gestalt ratings of online educational resources by faculty educators. Further testing is needed to ensure these instruments are accurate when applied by trainees.
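The abstract reports three kinds of statistics: an intraclass correlation coefficient (ICC) for inter-rater reliability, a Pearson correlation between score and faculty gestalt, and an odds ratio for ALiEM AIR certification. The sketch below is a minimal, illustrative Python example of how such statistics could be computed from a long-format table of trainee ratings; the synthetic data, column names, and the pingouin/scipy/statsmodels calls are assumptions for illustration, not the authors' actual analysis.

# Illustrative sketch only: synthetic data standing in for 40 trainee raters
# scoring 39 posts on the METRIQ-8 (8 items, 1-7 Likert, so totals of 8-56).
import numpy as np
import pandas as pd
import pingouin as pg
import statsmodels.api as sm
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_posts, n_raters = 39, 40                      # 39 OERs, 40 trainee raters (per the abstract)
quality = rng.normal(0, 1, n_posts)             # latent post quality (simulated)

# Long-format ratings table: one METRIQ-8 total per (post, rater) pair.
ratings = pd.DataFrame(
    [(p, r, float(np.clip(np.rint(32 + 6 * quality[p] + rng.normal(0, 6)), 8, 56)))
     for p in range(n_posts) for r in range(n_raters)],
    columns=["post", "rater", "metriq8"],
)

# 1) Inter-rater reliability: ICC of the 40 raters across the 39 posts.
icc = pg.intraclass_corr(data=ratings, targets="post", raters="rater", ratings="metriq8")
print(icc[["Type", "ICC", "pval"]])

# 2) Correlation of the mean score per post with (hypothetical) faculty gestalt ratings.
per_post = ratings.groupby("post")["metriq8"].mean()
gestalt = 4 + 0.8 * quality + rng.normal(0, 0.5, n_posts)
r, p = pearsonr(per_post, gestalt)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")

# 3) Odds of (hypothetical) AIR certification per one-point score increase, via logistic regression.
air = (quality + rng.normal(0, 0.8, n_posts) > 0).astype(int)
logit = sm.Logit(air, sm.add_constant(per_post)).fit(disp=0)
print("Odds ratio per point:", np.exp(logit.params["metriq8"]))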
format Online
Article
Text
id pubmed-5017842
institution National Center for Biotechnology Information
language English
publishDate 2016
publisher Department of Emergency Medicine, University of California, Irvine School of Medicine
record_format MEDLINE/PubMed
spelling pubmed-5017842 2016-09-13 West J Emerg Med Education Department of Emergency Medicine, University of California, Irvine School of Medicine 2016-09 2016-07-26 /pmc/articles/PMC5017842/ /pubmed/27625722 http://dx.doi.org/10.5811/westjem.2016.6.30825 Text en © 2016 Chan et al. This is an open access article distributed in accordance with the terms of the Creative Commons Attribution (CC BY 4.0) License. See: http://creativecommons.org/licenses/by/4.0/
title Derivation of Two Critical Appraisal Scores for Trainees to Evaluate Online Educational Resources: A METRIQ Study
topic Education
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5017842/
https://www.ncbi.nlm.nih.gov/pubmed/27625722
http://dx.doi.org/10.5811/westjem.2016.6.30825