AN AUDIT OF ASSESSMENT TOOLS IN A MEDICAL SCHOOL IN EASTERN SAUDI ARABIA

BACKGROUND: Assessment has a powerful influence on curriculum delivery. Medical instructors must use tools which conform to educational principles, and audit them as part of curriculum review.

AIM: To generate information to support recommendations for improving curriculum delivery.

SETTING: Pre-clinical and clinical departments in a College of Medicine, Saudi Arabia.

METHOD: A self-administered questionnaire was used in a cross-sectional survey to determine whether the assessment tools in use met basic standards of validity, reliability and currency, and whether feedback to students was adequate. Cost, feasibility and combinations of tools were excluded.

RESULTS: Thirty-one of 34 courses were evaluated. All 31 respondents used MCQs, especially single-best-answer (28/31) and true/false (13/31). Test questions were mostly selected by groups of teachers. Pre-clinical departments drew equally on “new” (10/14) and “used” (10/14) MCQs; clinical departments relied on “banked” MCQs (16/17). Most departments decided pass marks (28/31) and chose the College-set 60%; this was decided before the examination in 13/17 clinical departments but after it in 5/14 pre-clinical departments. Of the six essay users, five used model answers but only one used double marking. The OSCE was used by 7/17 clinical departments; five provided checklists. Only 3/31 used an optical reader. Post-marking review was done by 13/14 pre-clinical but only 10/17 clinical departments. Difficulty and discrimination indices were determined by only 4/31 departments. Feedback was provided by 12/14 pre-clinical and 7/17 clinical departments. Only 10/31 course coordinators had copies of the examination regulations.

RECOMMENDATIONS: The single-best-answer MCQ, properly constructed and adequately critiqued, is the preferred tool for assessing the theory domain. However, there should be fresh questions, item analyses, comparisons with previous results, optical reader systems and double marking. Departments should use the OSCE or OSPE more often. Long essays, true/false, fill-in-the-blank and more-than-one-correct-answer formats can safely be abolished. Departments or teams should set test papers and take decisions collectively. Feedback rates should be improved. A Center of Medical Education, including an Examination Center, is required. Fruitful future studies could include a repeat audit, the use of “negative questions”, and the number of MCQs per test paper. A comparative audit involving other regional medical schools may be of general interest.
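For reference, the “difficulty and discrimination indices” mentioned under RESULTS are standard item-analysis statistics. The definitions below are the conventional textbook ones, not formulas taken from the paper itself:

    % Conventional item-analysis statistics (standard psychometric
    % definitions; the abstract gives no formulas, so these are
    % assumed, not quoted from the paper).
    \[
      P_i = \frac{R_i}{N}, \qquad
      D_i = \frac{U_i - L_i}{n}
    \]
    % P_i: difficulty (facility) index of item i -- the proportion of
    %      all N examinees answering it correctly (R_i correct responses).
    % D_i: discrimination index of item i -- U_i and L_i are the numbers
    %      of correct responses in the upper- and lower-scoring groups
    %      (commonly the top and bottom 27% of examinees), each of size n.

By common convention, items with P_i between roughly 0.3 and 0.7 and D_i of about 0.3 or higher are regarded as satisfactory; the 4/31 figure in the RESULTS refers to how many departments computed such statistics at all.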

Bibliographic Details
Main Authors: Al-Rubaish, Abdullah M., Al-Umran, Khalid U., Wosornu, Lade
Format: Online Article Text
Language: English
Published: Medknow Publications & Media Pvt Ltd, 2005
Published in: J Family Community Med (Medical Education section)
Subjects: Medical Education
Collection: PubMed (National Center for Biotechnology Information), record pubmed-3410120
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3410120/
https://www.ncbi.nlm.nih.gov/pubmed/23012084
Copyright: © Journal of Family and Community Medicine. This is an open-access article distributed under the terms of the Creative Commons Attribution-Noncommercial-Share Alike 3.0 Unported license (http://creativecommons.org/licenses/by-nc-sa/3.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.