
Exploring assessor cognition as a source of score variability in a performance assessment of practice-based competencies

BACKGROUND: A common feature of performance assessments is the use of human assessors to render judgements on student performance. From a measurement perspective, variability among assessors when assessing students may be viewed as a concern because it negatively impacts score reliability and validity. However, from a contextual perspective, variability among assessors is considered both meaningful and expected. A qualitative examination of assessor cognition when assessing student performance can assist in exploring what components are amenable to improvement through enhanced rater training, and the extent of variability when viewing assessors as contributing their individual expertise. Therefore, the purpose of this study was to explore assessor cognition as a source of score variability in a performance assessment of practice-based competencies.

METHOD: A mixed-methods sequential explanatory study design was used, in which findings from the qualitative strand assisted in the interpretation of results from the quantitative strand. Scores from one objective structured clinical examination (OSCE) were obtained for 95 occupational therapy students. Two generalizability studies were conducted to examine the relative contribution of assessors as a source of score variability and to estimate the reliability of domain and holistic scores. Think-aloud interviews were conducted with eight participants assessing a subset of student performances from the OSCE in which they participated. Findings from the analysis of the think-aloud data, together with assessors' background characteristics, were used to assist in the interpretation of the variance component estimates involving assessors and of score reliability.

RESULTS: Results from the two generalizability analyses indicated that the highest-order interaction-error term involving assessors accounted for the second-highest proportion of variance, after student variation. Score reliability was higher under the holistic than under the analytic scoring framework. Verbal analysis of assessors' think-aloud interviews provided evidential support for the quantitative results.

CONCLUSIONS: This study provides insight into the nature and extent of assessor variability during a performance assessment of practice-based competencies. Study findings are interpretable from both the measurement and contextual perspectives on assessor cognition. An integrated understanding is important for elucidating the meaning underlying the numerical score, because the defensibility of inferences made about students' proficiencies relies on score quality, which in turn relies on expert judgements.

Bibliographic Details
Main Authors: Roduta Roberts, Mary; Cook, Megan; Chao, Iris C. I.
Journal: BMC Med Educ
Format: Online Article Text
Language: English
Published: BioMed Central, 2020 (online 2020-05-25)
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7249646/
https://www.ncbi.nlm.nih.gov/pubmed/32450851
http://dx.doi.org/10.1186/s12909-020-02077-6
License: © The Author(s) 2020. Open Access. This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
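Note: The METHOD and RESULTS sections of the abstract describe a generalizability (G-study) analysis that decomposes OSCE scores into variance components for students, assessors, and their interaction/error, and uses those components to estimate score reliability. The Python sketch below is only a minimal illustration of that general technique, assuming a hypothetical fully crossed students x assessors design with simulated holistic scores; the study's actual design, facets, scoring frameworks, and data are not reproduced here, and all names and numbers below are made up for illustration.

# Minimal illustrative G-study for a hypothetical crossed students (p) x assessors (r) design.
# This is NOT the authors' analysis; it only sketches the kind of variance decomposition
# and reliability (G / Phi) coefficients referred to in the abstract.
import numpy as np

def g_study_crossed(scores: np.ndarray) -> dict:
    """Estimate variance components and reliability coefficients.

    scores: 2-D array, rows = students (p), columns = assessors (r),
            one holistic score per student-assessor pair.
    """
    n_p, n_r = scores.shape
    grand = scores.mean()
    person_means = scores.mean(axis=1)
    rater_means = scores.mean(axis=0)

    # Two-way ANOVA sums of squares and mean squares (no replication).
    ss_p = n_r * ((person_means - grand) ** 2).sum()
    ss_r = n_p * ((rater_means - grand) ** 2).sum()
    ss_pr = ((scores - grand) ** 2).sum() - ss_p - ss_r
    ms_p = ss_p / (n_p - 1)
    ms_r = ss_r / (n_r - 1)
    ms_pr = ss_pr / ((n_p - 1) * (n_r - 1))

    # Variance components; the p x r interaction is confounded with residual error,
    # i.e. the "highest-order interaction-error term" mentioned in the abstract.
    var_pr_e = ms_pr
    var_p = max((ms_p - ms_pr) / n_r, 0.0)   # student (true-score) variance
    var_r = max((ms_r - ms_pr) / n_p, 0.0)   # assessor main-effect variance

    # Generalizability (relative) and dependability (absolute) coefficients
    # for a decision study using n_r assessors per student.
    g_coef = var_p / (var_p + var_pr_e / n_r)
    phi_coef = var_p / (var_p + (var_r + var_pr_e) / n_r)
    return {"var_p": var_p, "var_r": var_r, "var_pr_e": var_pr_e,
            "G": g_coef, "Phi": phi_coef}

# Hypothetical example: 95 students each scored by 2 assessors (simulated data).
rng = np.random.default_rng(0)
toy = rng.normal(loc=7.0, scale=1.0, size=(95, 1)) + rng.normal(scale=0.5, size=(95, 2))
print(g_study_crossed(toy))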