Validity of a cardiology fellow performance assessment: reliability and associations with standardized examinations and awards

Bibliographic Details

Main Authors: Cullen, Michael W., Klarich, Kyle W., Baldwin, Kristine M., Engstler, Gregory J., Mandrekar, Jay, Scott, Christopher G., Beckman, Thomas J.
Format: Online Article Text
Language: English
Published: BioMed Central 2022
Subjects: Research
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8925146/
https://www.ncbi.nlm.nih.gov/pubmed/35291995
http://dx.doi.org/10.1186/s12909-022-03239-4
author Cullen, Michael W.
Klarich, Kyle W.
Baldwin, Kristine M.
Engstler, Gregory J.
Mandrekar, Jay
Scott, Christopher G.
Beckman, Thomas J.
collection PubMed
description BACKGROUND: Most work on the validity of clinical assessments for measuring learner performance in graduate medical education has occurred at the residency level. Minimal research exists on the validity of clinical assessments in advanced subspecialties. We sought to determine the validity characteristics of cardiology fellows’ assessment scores during subspecialty training; cardiology is the largest subspecialty of internal medicine. Validity evidence included item content, internal consistency reliability, and associations between faculty-of-fellow clinical assessments and other pertinent variables. METHODS: This was a retrospective validation study exploring the content, internal structure, and relations-to-other-variables domains of validity evidence for scores on faculty-of-fellow clinical assessments that include the 10-item Mayo Cardiology Fellows Assessment (MCFA-10). Participants included 7 cardiology fellowship classes. The MCFA-10 items had previously been validated in the assessment of internal medicine residents. Internal structure evidence was assessed with Cronbach’s α. The outcome for relations-to-other-variables evidence was the overall mean faculty-of-fellow assessment score (scale 1–5); independent variables included common measures of fellow performance. FINDINGS: Participants included 65 cardiology fellows. The overall mean ± standard deviation faculty-of-fellow assessment score was 4.07 ± 0.18. Content evidence for the MCFA-10 scores was based on published literature and core competencies. Cronbach’s α was 0.98, indicating high internal consistency reliability and offering evidence of internal structure validity. In multivariable analysis providing relations-to-other-variables evidence, mean assessment scores were independently associated with in-training examination scores (β = 0.088 per 10-point increase; p = 0.05) and with receiving a departmental or institutional award (β = 0.152; p = 0.001). Assessment scores were not associated with educational conference attendance, compliance with completion of required evaluations, faculty appointment upon completion of training, or performance on the board certification examination. R² for the multivariable model was 0.25. CONCLUSIONS: These findings provide sound validity evidence for item content, internal consistency reliability, and associations with other variables for faculty-of-fellow clinical assessment scores that include MCFA-10 items during cardiology fellowship. Relations-to-other-variables evidence included associations of assessment scores with in-training examination performance and receipt of competitive awards. These data support the utility of the MCFA-10 as a measure of performance during cardiology training and could serve as a foundation for future research on the assessment of subspecialty learners.
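The two statistics the abstract leans on, Cronbach's α for internal structure and the multivariable model's β coefficients and R², can be illustrated with a minimal sketch. Everything below is hypothetical: the simulated item scores, predictor names, and toy regression are stand-ins constructed for illustration, not the study's data or its actual model.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_fellows x n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item across fellows
    total_var = items.sum(axis=1).var(ddof=1)  # variance of each fellow's summed score
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)

# Hypothetical MCFA-10-style data: 65 fellows x 10 items on a 1-5 scale,
# driven by a shared underlying level so the items correlate highly.
level = rng.normal(4.0, 0.3, size=(65, 1))
items = np.clip(level + rng.normal(0, 0.15, size=(65, 10)), 1, 5)
print(f"Cronbach's alpha ~= {cronbach_alpha(items):.2f}")  # near 1 when items track each other

# Hypothetical multivariable linear model: mean assessment score regressed on
# an in-training exam score (expressed per 10 points) and an award indicator (0/1).
y = items.mean(axis=1)                             # mean faculty-of-fellow score
exam_per10 = rng.normal(500, 50, 65) / 10          # exam score rescaled per 10 points
award = rng.integers(0, 2, 65).astype(float)       # received an award: yes/no
X = np.column_stack([np.ones(65), exam_per10, award])
beta, *_ = np.linalg.lstsq(X, y, rcond=None)       # least-squares coefficients
resid = y - X @ beta
r2 = 1 - resid.var() / y.var()                     # coefficient of determination
print(f"betas = {beta.round(3)}, R^2 = {r2:.2f}")
```

Read against the study's reported coefficients, a β of 0.088 per 10-point increase means a fellow scoring 10 points higher on the in-training examination would be expected to average about 0.09 points higher on the 1–5 assessment scale, holding the other predictors constant; the reported R² of 0.25 means the model explains about a quarter of the variance in mean assessment scores.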
format Online
Article
Text
id pubmed-8925146
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-8925146 2022-03-23 Validity of a cardiology fellow performance assessment: reliability and associations with standardized examinations and awards. BMC Med Educ, Research. BioMed Central 2022-03-15 /pmc/articles/PMC8925146/ /pubmed/35291995 http://dx.doi.org/10.1186/s12909-022-03239-4 Text en © The Author(s) 2022. Open Access: licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/); the Creative Commons Public Domain Dedication waiver (https://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
title Validity of a cardiology fellow performance assessment: reliability and associations with standardized examinations and awards
topic Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8925146/
https://www.ncbi.nlm.nih.gov/pubmed/35291995
http://dx.doi.org/10.1186/s12909-022-03239-4