Reliability of the Interprofessional Collaborator Assessment Rubric (ICAR) in Multi Source Feedback (MSF) with post-graduate medical residents
Main Authors: | Hayward, Mark F; Curran, Vernon; Curtis, Bryan; Schulz, Henry; Murphy, Sean |
Format: | Online Article Text |
Language: | English |
Published: | BioMed Central, 2014 |
Subjects: | Research Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4318203/ https://www.ncbi.nlm.nih.gov/pubmed/25551678 http://dx.doi.org/10.1186/s12909-014-0279-9 |
_version_ | 1782355822225915904 |
author | Hayward, Mark F; Curran, Vernon; Curtis, Bryan; Schulz, Henry; Murphy, Sean |
author_facet | Hayward, Mark F; Curran, Vernon; Curtis, Bryan; Schulz, Henry; Murphy, Sean |
author_sort | Hayward, Mark F |
collection | PubMed |
description | BACKGROUND: Increased attention on collaboration and teamwork competency development in medical education has raised the need for valid and reliable approaches to the assessment of collaboration competencies in post-graduate medical education. The purpose of this study was to evaluate the reliability of a modified Interprofessional Collaborator Assessment Rubric (ICAR) in a multi-source feedback (MSF) process for assessing post-graduate medical residents’ collaborator competencies. METHODS: Post-graduate medical residents (n = 16) received ICAR assessments from three different rater groups (physicians, nurses and allied health professionals) over a four-week rotation. Internal consistency, inter-rater reliability, inter-group differences and the relationship between rater characteristics and ICAR scores were analyzed using Cronbach’s alpha, one-way and two-way repeated measures ANOVA, and logistic regression. RESULTS: Missing data decreased from 13.1% using daily assessments to 8.8% using an MSF process, p = .032. High internal consistency was demonstrated for overall ICAR scores (α = .981) and for individual assessment domains within the ICAR (α = .881 to .963). There were no significant differences between the scores of physician, nurse, and allied health raters on collaborator competencies (F2,5 = 1.225, p = .297, η2 = .016). Rater gender was the only significant factor influencing scores, with female raters scoring residents significantly lower than male raters (6.12 v. 6.82; F1,5 = 7.184, p = .008, η2 = .045). CONCLUSION: The study findings suggest that the use of the modified ICAR in an MSF assessment process could be a feasible and reliable assessment approach to providing formative feedback to post-graduate medical residents on collaborator competencies. |
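The internal-consistency figures in the abstract (α = .981 overall, .881–.963 per domain) are Cronbach's alpha values. As a minimal illustrative sketch, alpha can be computed as k/(k−1) · (1 − Σ var(item)/var(total)); the ratings below are hypothetical, not the study's data.

```python
# Minimal sketch of Cronbach's alpha, the internal-consistency statistic
# reported for the ICAR scores. The rating data here are made up for
# illustration only; they are not from the study.

def cronbach_alpha(items):
    """items: one list per rubric item, each holding one score per rater."""
    k = len(items)           # number of items
    n = len(items[0])        # number of raters/respondents

    def variance(xs):
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)  # sample variance

    # total score per rater across all items
    totals = [sum(item[i] for item in items) for i in range(n)]
    item_var_sum = sum(variance(item) for item in items)
    return (k / (k - 1)) * (1 - item_var_sum / variance(totals))

# Hypothetical example: 4 rubric items scored by 5 raters on a 1-9 scale.
ratings = [
    [7, 6, 8, 5, 7],
    [6, 6, 7, 5, 6],
    [7, 5, 8, 6, 7],
    [8, 6, 7, 5, 6],
]
alpha = cronbach_alpha(ratings)
print(round(alpha, 3))  # → 0.892
```

Values near 1, as reported for the ICAR, indicate that the rubric's items vary together and are measuring a common underlying construct.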
format | Online Article Text |
id | pubmed-4318203 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2014 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-43182032015-02-06 Reliability of the Interprofessional Collaborator Assessment Rubric (ICAR) in Multi Source Feedback (MSF) with post-graduate medical residents Hayward, Mark F Curran, Vernon Curtis, Bryan Schulz, Henry Murphy, Sean BMC Med Educ Research Article BACKGROUND: Increased attention on collaboration and teamwork competency development in medical education has raised the need for valid and reliable approaches to the assessment of collaboration competencies in post-graduate medical education. The purpose of this study was to evaluate the reliability of a modified Interprofessional Collaborator Assessment Rubric (ICAR) in a multi-source feedback (MSF) process for assessing post-graduate medical residents’ collaborator competencies. METHODS: Post-graduate medical residents (n = 16) received ICAR assessments from three different rater groups (physicians, nurses and allied health professionals) over a four-week rotation. Internal consistency, inter-rater reliability, inter-group differences and the relationship between rater characteristics and ICAR scores were analyzed using Cronbach’s alpha, one-way and two-way repeated measures ANOVA, and logistic regression. RESULTS: Missing data decreased from 13.1% using daily assessments to 8.8% using an MSF process, p = .032. High internal consistency was demonstrated for overall ICAR scores (α = .981) and for individual assessment domains within the ICAR (α = .881 to .963). There were no significant differences between the scores of physician, nurse, and allied health raters on collaborator competencies (F2,5 = 1.225, p = .297, η2 = .016). Rater gender was the only significant factor influencing scores, with female raters scoring residents significantly lower than male raters (6.12 v. 6.82; F1,5 = 7.184, p = .008, η2 = .045).
CONCLUSION: The study findings suggest that the use of the modified ICAR in an MSF assessment process could be a feasible and reliable assessment approach to providing formative feedback to post-graduate medical residents on collaborator competencies. BioMed Central 2014-12-31 /pmc/articles/PMC4318203/ /pubmed/25551678 http://dx.doi.org/10.1186/s12909-014-0279-9 Text en © Hayward et al.; licensee BioMed Central. 2014 This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated. |
spellingShingle | Research Article Hayward, Mark F Curran, Vernon Curtis, Bryan Schulz, Henry Murphy, Sean Reliability of the Interprofessional Collaborator Assessment Rubric (ICAR) in Multi Source Feedback (MSF) with post-graduate medical residents |
title | Reliability of the Interprofessional Collaborator Assessment Rubric (ICAR) in Multi Source Feedback (MSF) with post-graduate medical residents |
title_full | Reliability of the Interprofessional Collaborator Assessment Rubric (ICAR) in Multi Source Feedback (MSF) with post-graduate medical residents |
title_fullStr | Reliability of the Interprofessional Collaborator Assessment Rubric (ICAR) in Multi Source Feedback (MSF) with post-graduate medical residents |
title_full_unstemmed | Reliability of the Interprofessional Collaborator Assessment Rubric (ICAR) in Multi Source Feedback (MSF) with post-graduate medical residents |
title_short | Reliability of the Interprofessional Collaborator Assessment Rubric (ICAR) in Multi Source Feedback (MSF) with post-graduate medical residents |
title_sort | reliability of the interprofessional collaborator assessment rubric (icar) in multi source feedback (msf) with post-graduate medical residents |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4318203/ https://www.ncbi.nlm.nih.gov/pubmed/25551678 http://dx.doi.org/10.1186/s12909-014-0279-9 |
work_keys_str_mv | AT haywardmarkf reliabilityoftheinterprofessionalcollaboratorassessmentrubricicarinmultisourcefeedbackmsfwithpostgraduatemedicalresidents AT curranvernon reliabilityoftheinterprofessionalcollaboratorassessmentrubricicarinmultisourcefeedbackmsfwithpostgraduatemedicalresidents AT curtisbryan reliabilityoftheinterprofessionalcollaboratorassessmentrubricicarinmultisourcefeedbackmsfwithpostgraduatemedicalresidents AT schulzhenry reliabilityoftheinterprofessionalcollaboratorassessmentrubricicarinmultisourcefeedbackmsfwithpostgraduatemedicalresidents AT murphysean reliabilityoftheinterprofessionalcollaboratorassessmentrubricicarinmultisourcefeedbackmsfwithpostgraduatemedicalresidents |