Using video-based examiner score comparison and adjustment (VESCA) to compare the influence of examiners at different sites in a distributed objective structured clinical exam (OSCE)
PURPOSE: Ensuring equivalence of examiners’ judgements within distributed objective structured clinical exams (OSCEs) is key to both fairness and validity but is hampered by the lack of cross-over in the performances which different groups of examiners observe. This study develops a novel method called Video-based Examiner Score Comparison and Adjustment (VESCA), using it to compare examiners’ scoring from different OSCE sites for the first time.
Main Authors: | Yeates, Peter; Maluf, Adriano; Cope, Natalie; McCray, Gareth; McBain, Stuart; Beardow, Dominic; Fuller, Richard; McKinley, Robert Bob |
Format: | Online Article Text |
Language: | English |
Published: | BioMed Central 2023 |
Subjects: | Research |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10605484/ https://www.ncbi.nlm.nih.gov/pubmed/37885005 http://dx.doi.org/10.1186/s12909-023-04774-4 |
_version_ | 1785127085481656320 |
author | Yeates, Peter Maluf, Adriano Cope, Natalie McCray, Gareth McBain, Stuart Beardow, Dominic Fuller, Richard McKinley, Robert Bob |
author_sort | Yeates, Peter |
collection | PubMed |
description | PURPOSE: Ensuring equivalence of examiners’ judgements within distributed objective structured clinical exams (OSCEs) is key to both fairness and validity but is hampered by the lack of cross-over in the performances which different groups of examiners observe. This study develops a novel method called Video-based Examiner Score Comparison and Adjustment (VESCA) and uses it, for the first time, to compare examiners’ scoring from different OSCE sites. MATERIALS/METHODS: Within a summative 16-station OSCE, volunteer students were videoed on each station, and all examiners were invited to score station-specific comparator videos in addition to their usual student scoring. The linkage provided by the video scores enabled use of Many Facet Rasch Modelling (MFRM) to compare (1) examiner-cohort and (2) site effects on students’ scores. RESULTS: Examiner cohorts varied by 6.9% in the overall score allocated to students of the same ability. Whilst only a tiny difference was apparent between sites, examiner-cohort variability was greater in one site than the other. Adjusting student scores produced a median change in rank position of 6 places (0.48 deciles); however, 26.9% of students changed their rank position by at least 1 decile. By contrast, only 1 student’s pass/fail classification was altered by score adjustment. CONCLUSIONS: Whilst comparatively limited examiner participation rates may limit interpretation of score adjustment in this instance, this study demonstrates the feasibility of using VESCA for quality assurance purposes in large-scale distributed OSCEs. |
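The abstract describes adjusting students’ scores for examiner-cohort effects and then measuring how far students move in the rank order (in deciles). VESCA itself estimates these effects with Many Facet Rasch Modelling via the video-score linkage; the sketch below is only a simplified illustration of the downstream adjustment-and-rank-shift step, using a plain additive severity correction. All names, numbers, and cohort severities here are hypothetical and are not taken from the study.

```python
import random

# Hypothetical setup: four examiner cohorts, each with an assumed additive
# severity/leniency effect (in percentage points) on the scores it awards.
# A real VESCA analysis would estimate these effects with MFRM, not assume them.
random.seed(0)
n_students = 100
cohorts = ["A", "B", "C", "D"]
severity = {"A": 0.0, "B": 1.5, "C": -1.0, "D": 2.5}  # hypothetical values

students = []
for i in range(n_students):
    ability = random.gauss(60, 8)            # simulated "true" score (0-100 scale)
    cohort = cohorts[i % len(cohorts)]
    observed = ability + severity[cohort]    # cohort effect inflates/deflates the score
    students.append({"id": i, "cohort": cohort, "observed": observed})

# Adjustment: remove each cohort's estimated severity from its students' scores.
for s in students:
    s["adjusted"] = s["observed"] - severity[s["cohort"]]

def deciles(score_key):
    """Map student id -> decile (1 = top 10%) when ranked by the given score."""
    ranked = sorted(students, key=lambda s: -s[score_key])
    return {s["id"]: (pos * 10) // len(ranked) + 1 for pos, s in enumerate(ranked)}

before = deciles("observed")
after = deciles("adjusted")
moved = sum(1 for i in before if before[i] != after[i])
print(f"{moved} of {n_students} students changed decile after adjustment")
```

The same bookkeeping generalises to the study’s reported metrics (median rank change, proportion moving at least one decile, pass/fail flips) once real MFRM severity estimates replace the assumed `severity` values.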
format | Online Article Text |
id | pubmed-10605484 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-10605484 2023-10-28 BMC Med Educ (Research) BioMed Central 2023-10-26 /pmc/articles/PMC10605484/ /pubmed/37885005 http://dx.doi.org/10.1186/s12909-023-04774-4 Text en © The Author(s) 2023. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/); the Creative Commons Public Domain Dedication waiver (https://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data. |
title | Using video-based examiner score comparison and adjustment (VESCA) to compare the influence of examiners at different sites in a distributed objective structured clinical exam (OSCE) |
topic | Research |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10605484/ https://www.ncbi.nlm.nih.gov/pubmed/37885005 http://dx.doi.org/10.1186/s12909-023-04774-4 |