A National Study of Longitudinal Consistency in ACGME Milestone Ratings by Clinical Competency Committees: Exploring an Aspect of Validity in the Assessment of Residents’ Competence
To investigate whether clinical competency committees (CCCs) were consistent in applying milestone ratings for first-year residents over time or whether ratings increased or decreased. METHOD: Beginning in December 2013, the Accreditation Council for Graduate Medical Education (ACGME) initiated a phased-in requirement for reporting milestones; emergency medicine (EM), diagnostic radiology (DR), and urology (UR) were among the earliest reporting specialties. The authors analyzed CCC milestone ratings of first-year residents from 2013 to 2016 from all ACGME-accredited EM, DR, and UR programs for which they had data. The number of first-year residents in these programs ranged from 2,838 to 2,928 over this time period. The program-level average milestone rating for each subcompetency was regressed onto the time of observation using a random coefficient multilevel regression model. RESULTS: National average program-level milestone ratings of first-year residents decreased significantly over the observed time period for 32 of the 56 subcompetencies examined. None of the other subcompetencies showed a significant change. National average in-training examination scores for each of the specialties remained essentially unchanged over the time period, suggesting that differences between the cohorts were not likely an explanatory factor. CONCLUSIONS: The findings indicate that CCCs tend to become more stringent or maintain consistency in their ratings of beginning residents over time. One explanation for these results is that CCCs may become increasingly comfortable in assigning lower ratings when appropriate. This finding is consistent with an increase in confidence with the milestone rating process and the quality of feedback it provides.
Main Authors: | Hamstra, Stanley J.; Yamazaki, Kenji; Barton, Melissa A.; Santen, Sally A.; Beeson, Michael S.; Holmboe, Eric S. |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Published for the Association of American Medical Colleges by Lippincott Williams & Wilkins, 2019 |
Subjects: | Research Reports |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6760653/ https://www.ncbi.nlm.nih.gov/pubmed/31169540 http://dx.doi.org/10.1097/ACM.0000000000002820 |
author | Hamstra, Stanley J.; Yamazaki, Kenji; Barton, Melissa A.; Santen, Sally A.; Beeson, Michael S.; Holmboe, Eric S. |
collection | PubMed |
description | To investigate whether clinical competency committees (CCCs) were consistent in applying milestone ratings for first-year residents over time or whether ratings increased or decreased. METHOD: Beginning in December 2013, the Accreditation Council for Graduate Medical Education (ACGME) initiated a phased-in requirement for reporting milestones; emergency medicine (EM), diagnostic radiology (DR), and urology (UR) were among the earliest reporting specialties. The authors analyzed CCC milestone ratings of first-year residents from 2013 to 2016 from all ACGME-accredited EM, DR, and UR programs for which they had data. The number of first-year residents in these programs ranged from 2,838 to 2,928 over this time period. The program-level average milestone rating for each subcompetency was regressed onto the time of observation using a random coefficient multilevel regression model. RESULTS: National average program-level milestone ratings of first-year residents decreased significantly over the observed time period for 32 of the 56 subcompetencies examined. None of the other subcompetencies showed a significant change. National average in-training examination scores for each of the specialties remained essentially unchanged over the time period, suggesting that differences between the cohorts were not likely an explanatory factor. CONCLUSIONS: The findings indicate that CCCs tend to become more stringent or maintain consistency in their ratings of beginning residents over time. One explanation for these results is that CCCs may become increasingly comfortable in assigning lower ratings when appropriate. This finding is consistent with an increase in confidence with the milestone rating process and the quality of feedback it provides. |
format | Online Article Text |
id | pubmed-6760653 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | Published for the Association of American Medical Colleges by Lippincott Williams & Wilkins |
record_format | MEDLINE/PubMed |
spelling | pubmed-6760653 (2019-10-07). Acad Med, Research Reports. Published 2019-10; available online 2019-06-04. /pmc/articles/PMC6760653/ /pubmed/31169540 http://dx.doi.org/10.1097/ACM.0000000000002820 Text en Copyright © 2019 The Author(s). Published by Wolters Kluwer Health, Inc. on behalf of the Association of American Medical Colleges. This is an open-access article distributed under the terms of the Creative Commons Attribution-Non Commercial-No Derivatives License 4.0 (CC BY-NC-ND) (http://creativecommons.org/licenses/by-nc-nd/4.0/), where it is permissible to download and share the work provided it is properly cited. The work cannot be changed in any way or used commercially without permission from the journal. |
title | A National Study of Longitudinal Consistency in ACGME Milestone Ratings by Clinical Competency Committees: Exploring an Aspect of Validity in the Assessment of Residents’ Competence |
topic | Research Reports |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6760653/ https://www.ncbi.nlm.nih.gov/pubmed/31169540 http://dx.doi.org/10.1097/ACM.0000000000002820 |