
Validity evidence for programmatic assessment in competency-based education

INTRODUCTION: Competency-based education (CBE) is now pervasive in health professions education. A foundational principle of CBE is to assess and identify the progression of competency development in students over time. It has been argued that a programmatic approach to assessment in CBE maximizes student learning. The aim of this study is to investigate whether programmatic assessment, i.e., a system of assessment, can be used within a CBE framework to track progression of student learning within and across competencies over time. METHODS: Three workplace-based assessment methods were used to measure the same seven competency domains. We performed a retrospective quantitative analysis of 327,974 assessment data points from 16,575 completed assessment forms from 962 students over 124 weeks using both descriptive (visualization) and modelling (inferential) analyses. This included multilevel random coefficient modelling and generalizability theory. RESULTS: Random coefficient modelling indicated that variance due to differences in inter-student performance was highest (40%). The reliability coefficients of scores from assessment methods ranged from 0.86 to 0.90. Method and competency variance components were in the small-to-moderate range. DISCUSSION: The current validation evidence provides cause for optimism regarding the explicit development and implementation of a program of assessment within CBE. The majority of the variance in scores appears to be student-related and reliable, supporting the psychometric properties as well as both formative and summative score applications.

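For context on the reliability figures cited under RESULTS: generalizability (G) theory expresses score reliability as a ratio of variance components. As a textbook illustration only (this record does not specify the authors' exact G-study design), a relative G coefficient for a students (p) × methods (m) × competencies (c) design would take the form

$$E\rho^{2} = \frac{\sigma^{2}_{p}}{\sigma^{2}_{p} + \dfrac{\sigma^{2}_{pm}}{n_{m}} + \dfrac{\sigma^{2}_{pc}}{n_{c}} + \dfrac{\sigma^{2}_{pmc,e}}{n_{m}\,n_{c}}}$$

where $\sigma^{2}_{p}$ is the student (person) variance component and $n_{m}$, $n_{c}$ are the numbers of methods and competencies sampled. Under this reading, coefficients of 0.86 to 0.90 indicate that student variance dominates the method- and competency-related error terms.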

Bibliographic Details
Main Authors: Bok, Harold G. J., de Jong, Lubberta H., O’Neill, Thomas, Maxey, Connor, Hecker, Kent G.
Format: Online Article Text
Language: English
Published: Bohn Stafleu van Loghum, 2018
Subjects: Original Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6283777/
https://www.ncbi.nlm.nih.gov/pubmed/30430439
http://dx.doi.org/10.1007/s40037-018-0481-2
Collection: PubMed
Record ID: pubmed-6283777
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: Perspect Med Educ
Published Online: 2018-11-14; Issue Date: 2018-12
Rights: © The Author(s) 2018. Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.