Measuring medical students’ professional competencies in a problem-based curriculum: a reliability study
BACKGROUND: Identification and assessment of professional competencies for medical students is challenging. We have recently developed an instrument for PBL tutors to assess the essential professional competencies of medical students in Problem-Based Learning (PBL) programs. This study aims to evaluate the reliability and validity of professional competency scores of medical students assessed with this instrument in PBL tutorials.
Main Authors: | Kassab, Salah Eldin; Du, Xiangyun; Toft, Egon; Cyprian, Farhan; Al-Moslih, Ayad; Schmidt, Henk; Hamdy, Hossam; Abu-Hijleh, Marwan |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | BioMed Central, 2019 |
Subjects: | Research Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6528362/ https://www.ncbi.nlm.nih.gov/pubmed/31113457 http://dx.doi.org/10.1186/s12909-019-1594-y |
_version_ | 1783420202046717952 |
---|---|
author | Kassab, Salah Eldin Du, Xiangyun Toft, Egon Cyprian, Farhan Al-Moslih, Ayad Schmidt, Henk Hamdy, Hossam Abu-Hijleh, Marwan |
author_facet | Kassab, Salah Eldin Du, Xiangyun Toft, Egon Cyprian, Farhan Al-Moslih, Ayad Schmidt, Henk Hamdy, Hossam Abu-Hijleh, Marwan |
author_sort | Kassab, Salah Eldin |
collection | PubMed |
description | BACKGROUND: Identification and assessment of professional competencies for medical students is challenging. We have recently developed an instrument for PBL tutors to assess the essential professional competencies of medical students in Problem-Based Learning (PBL) programs. This study aims to evaluate the reliability and validity of professional competency scores of medical students assessed with this instrument in PBL tutorials. METHODS: Each group of seven to eight students in PBL tutorials (Year 2, n = 46) was assessed independently by two faculty members. Each tutor assessed the students in his/her group every five weeks on four occasions. The instrument consists of ten items, which measure three main competency domains: interpersonal, cognitive and professional behavior. Each item is scored on a five-point Likert scale (1 = poor, 5 = exceptional). Reliability of the professional competency scores was calculated using generalizability theory (G-theory) with raters nested in occasions. In addition, criterion-related validity was evaluated by testing the correlations with students’ scores in the written examination. RESULTS: The overall generalizability coefficient (G) of the professional competency scores was 0.80. Students’ professional competency scores (universe scores) accounted for 27% of the total variance across all score comparisons. The variance due to occasions accounted for 10%, while the student-occasion interaction was zero. The variance due to raters nested in occasions represented 8% of the total variance, and the remaining 55% was due to unexplained sources of error. Reliability was highest for the interpersonal domain (G = 0.84) and lowest for the professional behavior domain (G = 0.76). Results from the decision (D) study suggested that adequate dependability (G = 0.71) can be achieved with one rater over five occasions (see the worked sketch following the record fields below). Furthermore, the written examination scores correlated positively with the cognitive competency scores (r = 0.46, P < 0.01), but not with the other two competency domains (interpersonal and professional behavior). CONCLUSIONS: This study demonstrates that professional competency assessment scores of medical students in PBL tutorials have acceptable reliability. Further validation studies are required before the instrument is used by PBL tutors for summative evaluation of students. |
format | Online Article Text |
id | pubmed-6528362 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2019 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-6528362 2019-05-28 Measuring medical students’ professional competencies in a problem-based curriculum: a reliability study Kassab, Salah Eldin Du, Xiangyun Toft, Egon Cyprian, Farhan Al-Moslih, Ayad Schmidt, Henk Hamdy, Hossam Abu-Hijleh, Marwan BMC Med Educ Research Article BACKGROUND: Identification and assessment of professional competencies for medical students is challenging. We have recently developed an instrument for PBL tutors to assess the essential professional competencies of medical students in Problem-Based Learning (PBL) programs. This study aims to evaluate the reliability and validity of professional competency scores of medical students assessed with this instrument in PBL tutorials. METHODS: Each group of seven to eight students in PBL tutorials (Year 2, n = 46) was assessed independently by two faculty members. Each tutor assessed the students in his/her group every five weeks on four occasions. The instrument consists of ten items, which measure three main competency domains: interpersonal, cognitive and professional behavior. Each item is scored on a five-point Likert scale (1 = poor, 5 = exceptional). Reliability of the professional competency scores was calculated using generalizability theory (G-theory) with raters nested in occasions. In addition, criterion-related validity was evaluated by testing the correlations with students’ scores in the written examination. RESULTS: The overall generalizability coefficient (G) of the professional competency scores was 0.80. Students’ professional competency scores (universe scores) accounted for 27% of the total variance across all score comparisons. The variance due to occasions accounted for 10%, while the student-occasion interaction was zero. The variance due to raters nested in occasions represented 8% of the total variance, and the remaining 55% was due to unexplained sources of error. Reliability was highest for the interpersonal domain (G = 0.84) and lowest for the professional behavior domain (G = 0.76). Results from the decision (D) study suggested that adequate dependability (G = 0.71) can be achieved with one rater over five occasions. Furthermore, the written examination scores correlated positively with the cognitive competency scores (r = 0.46, P < 0.01), but not with the other two competency domains (interpersonal and professional behavior). CONCLUSIONS: This study demonstrates that professional competency assessment scores of medical students in PBL tutorials have acceptable reliability. Further validation studies are required before the instrument is used by PBL tutors for summative evaluation of students. BioMed Central 2019-05-21 /pmc/articles/PMC6528362/ /pubmed/31113457 http://dx.doi.org/10.1186/s12909-019-1594-y Text en © The Author(s). 2019 Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated. |
spellingShingle | Research Article Kassab, Salah Eldin Du, Xiangyun Toft, Egon Cyprian, Farhan Al-Moslih, Ayad Schmidt, Henk Hamdy, Hossam Abu-Hijleh, Marwan Measuring medical students’ professional competencies in a problem-based curriculum: a reliability study |
title | Measuring medical students’ professional competencies in a problem-based curriculum: a reliability study |
title_full | Measuring medical students’ professional competencies in a problem-based curriculum: a reliability study |
title_fullStr | Measuring medical students’ professional competencies in a problem-based curriculum: a reliability study |
title_full_unstemmed | Measuring medical students’ professional competencies in a problem-based curriculum: a reliability study |
title_short | Measuring medical students’ professional competencies in a problem-based curriculum: a reliability study |
title_sort | measuring medical students’ professional competencies in a problem-based curriculum: a reliability study |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6528362/ https://www.ncbi.nlm.nih.gov/pubmed/31113457 http://dx.doi.org/10.1186/s12909-019-1594-y |
work_keys_str_mv | AT kassabsalaheldin measuringmedicalstudentsprofessionalcompetenciesinaproblembasedcurriculumareliabilitystudy AT duxiangyun measuringmedicalstudentsprofessionalcompetenciesinaproblembasedcurriculumareliabilitystudy AT toftegon measuringmedicalstudentsprofessionalcompetenciesinaproblembasedcurriculumareliabilitystudy AT cyprianfarhan measuringmedicalstudentsprofessionalcompetenciesinaproblembasedcurriculumareliabilitystudy AT almoslihayad measuringmedicalstudentsprofessionalcompetenciesinaproblembasedcurriculumareliabilitystudy AT schmidthenk measuringmedicalstudentsprofessionalcompetenciesinaproblembasedcurriculumareliabilitystudy AT hamdyhossam measuringmedicalstudentsprofessionalcompetenciesinaproblembasedcurriculumareliabilitystudy AT abuhijlehmarwan measuringmedicalstudentsprofessionalcompetenciesinaproblembasedcurriculumareliabilitystudy |
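
The variance components reported in the abstract (27% universe-score variance, zero student-occasion interaction, 8% for raters nested in occasions, 55% residual error) are enough to reproduce the reported generalizability coefficients. The sketch below is a minimal illustration, not the authors' analysis code: it assumes the standard relative-error formula for a design with raters nested in occasions (p x (r:o)), uses the proportions from the abstract, and all variable and function names here are hypothetical.

```python
# Illustrative sketch (not the authors' code): recompute the reported G
# coefficients from the variance-component proportions in the abstract,
# assuming the standard relative-error term for a p x (r:o) design
# (raters nested in occasions). All names here are hypothetical.

VAR_STUDENT = 0.27           # universe-score variance (students)
VAR_STUDENT_OCCASION = 0.00  # student x occasion interaction
VAR_RESIDUAL = 0.55          # student x rater:occasion + unexplained error
# Occasion variance (0.10) and rater:occasion variance (0.08) do not enter
# the relative-error term used for norm-referenced (relative) decisions.

def g_coefficient(n_occasions: int, n_raters: int) -> float:
    """Relative G coefficient for n_raters raters nested in n_occasions occasions."""
    relative_error = (VAR_STUDENT_OCCASION / n_occasions
                      + VAR_RESIDUAL / (n_occasions * n_raters))
    return VAR_STUDENT / (VAR_STUDENT + relative_error)

# G-study design as described (2 raters, 4 occasions) -> ~0.80 as reported
print(round(g_coefficient(n_occasions=4, n_raters=2), 2))
# D-study projection (1 rater, 5 occasions) -> ~0.71 as reported
print(round(g_coefficient(n_occasions=5, n_raters=1), 2))
```

Under these assumptions the two printed values match the G-study coefficient (0.80) and the D-study dependability estimate (0.71) reported in the abstract, which is consistent with the nested rater-within-occasion design described in the METHODS.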