
Psychometric analysis of multiple-choice questions in an innovative curriculum in Kingdom of Saudi Arabia


Bibliographic Details
Main Authors: Salih, Karim Eldin M. A., Jibo, Abubakar, Ishaq, Masoud, Khan, Sameer, Mohammed, Osama A., AL-Shahrani, Abdullah M., Abbas, Mohammed
Format: Online Article Text
Language: English
Published: Wolters Kluwer - Medknow 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7567208/
https://www.ncbi.nlm.nih.gov/pubmed/33102347
http://dx.doi.org/10.4103/jfmpc.jfmpc_358_20
author Salih, Karim Eldin M. A.
Jibo, Abubakar
Ishaq, Masoud
Khan, Sameer
Mohammed, Osama A.
AL-Shahrani, Abdullah M.
Abbas, Mohammed
author_sort Salih, Karim Eldin M. A.
collection PubMed
description BACKGROUND AND AIMS: Worldwide, medical education and the assessment of medical students are evolving. Psychometric analysis of the adopted assessment methods is therefore necessary for an efficient, reliable, valid, and evidence-based approach to student assessment. The objective of this study was to determine the pattern of psychometric indices of our courses conducted in the academic year 2018-2019, in an innovative curriculum. METHODS: This was a cross-sectional study reviewing examination items over one academic session (2018/2019). Item analyses for all courses completed within the three phases of the year were analyzed using SPSS V20 statistical software. RESULTS: There were 24 courses conducted during the academic year 2018-2019, across the three academic phases. There were 1073 examination items in total, with 3219 distractors, in one-best-of-four-options multiple-choice questions (MCQs). Item analysis showed a mean difficulty index (DIF I) of 79.1 ± 3.3. Items with good discrimination had a mean of 65 ± 11.2, and distractor efficiency was 80.9%. The reliability index (KR-20) across all exams in the three phases was 0.75. There was a significant difference within the examination item blocks (F = 12.31, F critical = 3.33, P < 0.05) across all phases of the courses taken by the students. Similarly, significant differences existed among the three phases of the courses taken (F = 12.44, F critical = 4.10, P < 0.05). CONCLUSION: The psychometric analysis showed that the quality of the examination questions was valid and reliable. Although differences in item quality were observed between phases of study as well as within courses, quality generally remained consistent throughout the session. More effort should be channeled toward improving item quality in the future.
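The indices reported in the abstract (difficulty index, discrimination index, distractor efficiency, KR-20 reliability) are standard item-analysis statistics. As a minimal sketch of how they are typically computed — using a small hypothetical 0/1 response matrix, not the study's data — one might write:

```python
# Item-analysis sketch: difficulty index, discrimination index, distractor
# efficiency, and KR-20 reliability for dichotomously scored MCQs.
# All data below are illustrative, not taken from the study.

def difficulty_index(item_scores):
    """Percentage of examinees answering the item correctly (DIF I)."""
    return 100.0 * sum(item_scores) / len(item_scores)

def discrimination_index(scores, item):
    """Upper-minus-lower proportion correct, using top/bottom 27% by total score."""
    ranked = sorted(scores, key=sum, reverse=True)
    n = max(1, round(0.27 * len(ranked)))
    upper = sum(s[item] for s in ranked[:n]) / n
    lower = sum(s[item] for s in ranked[-n:]) / n
    return upper - lower

def distractor_efficiency(distractor_counts, n_examinees):
    """Percent of distractors chosen by at least 5% of examinees (functional)."""
    functional = sum(1 for c in distractor_counts if c >= 0.05 * n_examinees)
    return 100.0 * functional / len(distractor_counts)

def kr20(scores):
    """Kuder-Richardson formula 20 reliability for dichotomous items."""
    k = len(scores[0])                      # number of items
    n = len(scores)                         # number of examinees
    totals = [sum(s) for s in scores]
    mean = sum(totals) / n
    var = sum((t - mean) ** 2 for t in totals) / (n - 1)  # sample variance of totals
    pq = 0.0
    for i in range(k):
        p = sum(s[i] for s in scores) / n   # proportion correct on item i
        pq += p * (1 - p)
    return (k / (k - 1)) * (1 - pq / var)

# Hypothetical responses: each row is one examinee, each column one item (1 = correct).
scores = [
    [1, 1, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
    [1, 1, 0, 0],
]

print(difficulty_index([s[0] for s in scores]))   # difficulty of item 0, in %
print(discrimination_index(scores, 0))
print(distractor_efficiency([10, 2, 0], 60))      # 3 distractors, 60 examinees
print(kr20(scores))
```

Note that KR-20 (the "Kr20" of the abstract) is the dichotomous special case of Cronbach's alpha; values around 0.7-0.8, as reported here, are conventionally read as acceptable reliability for classroom examinations.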
format Online
Article
Text
id pubmed-7567208
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Wolters Kluwer - Medknow
record_format MEDLINE/PubMed
spelling pubmed-7567208 2020-10-22 Psychometric analysis of multiple-choice questions in an innovative curriculum in Kingdom of Saudi Arabia Salih, Karim Eldin M. A. Jibo, Abubakar Ishaq, Masoud Khan, Sameer Mohammed, Osama A. AL-Shahrani, Abdullah M. Abbas, Mohammed J Family Med Prim Care Original Article Wolters Kluwer - Medknow 2020-07-30 /pmc/articles/PMC7567208/ /pubmed/33102347 http://dx.doi.org/10.4103/jfmpc.jfmpc_358_20 Text en Copyright: © 2020 Journal of Family Medicine and Primary Care http://creativecommons.org/licenses/by-nc-sa/4.0 This is an open access journal, and articles are distributed under the terms of the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 License, which allows others to remix, tweak, and build upon the work non-commercially, as long as appropriate credit is given and the new creations are licensed under the identical terms.
title Psychometric analysis of multiple-choice questions in an innovative curriculum in Kingdom of Saudi Arabia
topic Original Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7567208/
https://www.ncbi.nlm.nih.gov/pubmed/33102347
http://dx.doi.org/10.4103/jfmpc.jfmpc_358_20