Developing, evaluating and validating a scoring rubric for written case reports
OBJECTIVES: The purpose of this study was to evaluate Family Medicine Clerkship students’ writing skills using an anchored scoring rubric. In this study, we report on the assessment of a current scoring rubric (SR) used to grade written case description papers (CDP) for medical students, describe the development of a revised SR with examination of scoring consistency among faculty raters, and report on feedback from students regarding SR revisions and written CDP.
Main Authors: | Cyr, Peggy R.; Smith, Kahsi A.; Broyles, India L.; Holt, Christina T. |
Format: | Online Article Text |
Language: | English |
Published: | IJME, 2014 |
Subjects: | Research Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4207174/ https://www.ncbi.nlm.nih.gov/pubmed/25341207 http://dx.doi.org/10.5116/ijme.52c6.d7ef |
_version_ | 1782340930581299200 |
author | Cyr, Peggy R. Smith, Kahsi A. Broyles, India L. Holt, Christina T. |
author_facet | Cyr, Peggy R. Smith, Kahsi A. Broyles, India L. Holt, Christina T. |
author_sort | Cyr, Peggy R. |
collection | PubMed |
description | OBJECTIVES: The purpose of this study was to evaluate Family Medicine Clerkship students’ writing skills using an anchored scoring rubric. In this study, we report on the assessment of a current scoring rubric (SR) used to grade written case description papers (CDP) for medical students, describe the development of a revised SR with examination of scoring consistency among faculty raters, and report on feedback from students regarding SR revisions and written CDP. METHODS: Five faculty members scored a total of eighty-three written CDP using both the Original SR (OSR) and the Revised SR1 (RSR1) during the 2009-2010 academic year. RESULTS: Overall, increased faculty inter-rater reliability was obtained using the RSR1. Additionally, a subset analysis revealed that the five faculty using the Revised SR2 (RSR2) had a high measure of inter-rater reliability on their scoring of this subset of papers, as measured by intra-class correlation (ICC = 0.93, p < 0.001). CONCLUSIONS: Findings from this research have implications for medical education by highlighting the importance of the assessment and development of reliable evaluation tools for medical student writing projects. |
format | Online Article Text |
id | pubmed-4207174 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2014 |
publisher | IJME |
record_format | MEDLINE/PubMed |
spelling | pubmed-42071742014-10-23 Developing, evaluating and validating a scoring rubric for written case reports Cyr, Peggy R. Smith, Kahsi A. Broyles, India L. Holt, Christina T. Int J Med Educ Research Article OBJECTIVES: The purpose of this study was to evaluate Family Medicine Clerkship students’ writing skills using an anchored scoring rubric. In this study, we report on the assessment of a current scoring rubric (SR) used to grade written case description papers (CDP) for medical students, describe the development of a revised SR with examination of scoring consistency among faculty raters, and report on feedback from students regarding SR revisions and written CDP. METHODS: Five faculty members scored a total of eighty-three written CDP using both the Original SR (OSR) and the Revised SR1 (RSR1) during the 2009-2010 academic year. RESULTS: Overall, increased faculty inter-rater reliability was obtained using the RSR1. Additionally, a subset analysis revealed that the five faculty using the Revised SR2 (RSR2) had a high measure of inter-rater reliability on their scoring of this subset of papers, as measured by intra-class correlation (ICC = 0.93, p < 0.001). CONCLUSIONS: Findings from this research have implications for medical education by highlighting the importance of the assessment and development of reliable evaluation tools for medical student writing projects. IJME 2014-02-01 /pmc/articles/PMC4207174/ /pubmed/25341207 http://dx.doi.org/10.5116/ijme.52c6.d7ef Text en Copyright: © 2014 Peggy R. Cyr et al. http://creativecommons.org/licenses/by/3.0 This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use of the work provided the original work is properly cited. http://creativecommons.org/licenses/by/3.0/ |
spellingShingle | Research Article Cyr, Peggy R. Smith, Kahsi A. Broyles, India L. Holt, Christina T. Developing, evaluating and validating a scoring rubric for written case reports |
title | Developing, evaluating and validating a scoring rubric for written case reports |
title_full | Developing, evaluating and validating a scoring rubric for written case reports |
title_fullStr | Developing, evaluating and validating a scoring rubric for written case reports |
title_full_unstemmed | Developing, evaluating and validating a scoring rubric for written case reports |
title_short | Developing, evaluating and validating a scoring rubric for written case reports |
title_sort | developing, evaluating and validating a scoring rubric for written case reports |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4207174/ https://www.ncbi.nlm.nih.gov/pubmed/25341207 http://dx.doi.org/10.5116/ijme.52c6.d7ef |
work_keys_str_mv | AT cyrpeggyr developingevaluatingandvalidatingascoringrubricforwrittencasereports AT smithkahsia developingevaluatingandvalidatingascoringrubricforwrittencasereports AT broylesindial developingevaluatingandvalidatingascoringrubricforwrittencasereports AT holtchristinat developingevaluatingandvalidatingascoringrubricforwrittencasereports |
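The abstract reports inter-rater reliability as an intra-class correlation (ICC = 0.93). As an illustration of how such a statistic is computed from a papers × raters score matrix, here is a minimal ICC(2,1) sketch (two-way random effects, absolute agreement, single rater); the rating data below are entirely hypothetical and are not the study's data, and the paper does not specify which ICC form was used:

```python
import numpy as np

def icc_2_1(scores):
    """ICC(2,1): two-way random effects, absolute agreement, single rater.

    scores: (n_subjects, k_raters) matrix of ratings.
    """
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    grand = scores.mean()
    row_means = scores.mean(axis=1)   # per-paper means
    col_means = scores.mean(axis=0)   # per-rater means
    # Two-way ANOVA sums of squares
    ss_total = ((scores - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()   # between papers
    ss_cols = n * ((col_means - grand) ** 2).sum()   # between raters
    ss_err = ss_total - ss_rows - ss_cols
    # Mean squares
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    # Shrout–Fleiss ICC(2,1)
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# Hypothetical example: 6 papers scored by 5 faculty raters
ratings = np.array([
    [88, 90, 87, 89, 91],
    [72, 75, 70, 74, 73],
    [95, 94, 96, 93, 95],
    [60, 62, 59, 61, 63],
    [80, 82, 81, 79, 83],
    [91, 90, 92, 89, 90],
])
print(round(icc_2_1(ratings), 3))
```

Because the raters in this toy matrix disagree by only a point or two while the papers span a wide score range, the ICC comes out near 1; wider rater disagreement relative to between-paper variation would drive it toward 0.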