Validation of educational assessments: a primer for simulation and beyond

Bibliographic Details
Main Authors: Cook, David A., Hatala, Rose
Format: Online Article Text
Language: English
Published: BioMed Central 2016
Subjects: Methodology Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5806296/
https://www.ncbi.nlm.nih.gov/pubmed/29450000
http://dx.doi.org/10.1186/s41077-016-0033-y
collection PubMed
description BACKGROUND: Simulation plays a vital role in health professions assessment. This review provides a primer on assessment validation for educators and education researchers. We focus on simulation-based assessment of health professionals, but the principles apply broadly to other assessment approaches and topics. KEY PRINCIPLES: Validation refers to the process of collecting validity evidence to evaluate the appropriateness of the interpretations, uses, and decisions based on assessment results. Contemporary frameworks view validity as a hypothesis, and validity evidence is collected to support or refute the validity hypothesis (i.e., that the proposed interpretations and decisions are defensible). In validation, the educator or researcher defines the proposed interpretations and decisions, identifies and prioritizes the most questionable assumptions in making these interpretations and decisions (the “interpretation-use argument”), empirically tests those assumptions using existing or newly-collected evidence, and then summarizes the evidence as a coherent “validity argument.” A framework proposed by Messick identifies potential evidence sources: content, response process, internal structure, relationships with other variables, and consequences. Another framework proposed by Kane identifies key inferences in generating useful interpretations: scoring, generalization, extrapolation, and implications/decision. We propose an eight-step approach to validation that applies to either framework: Define the construct and proposed interpretation, make explicit the intended decision(s), define the interpretation-use argument and prioritize needed validity evidence, identify candidate instruments and/or create/adapt a new instrument, appraise existing evidence and collect new evidence as needed, keep track of practical issues, formulate the validity argument, and make a judgment: does the evidence support the intended use? 
CONCLUSIONS: Rigorous validation first prioritizes and then empirically evaluates key assumptions in the interpretation and use of assessment scores. Validation science would be improved by more explicit articulation and prioritization of the interpretation-use argument, greater use of formal validation frameworks, and more evidence informing the consequences and implications of assessment.
format Online
Article
Text
id pubmed-5806296
institution National Center for Biotechnology Information
language English
publishDate 2016
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-5806296 2018-02-15 Validation of educational assessments: a primer for simulation and beyond Cook, David A. Hatala, Rose Adv Simul (Lond) Methodology Article BioMed Central 2016-12-07 /pmc/articles/PMC5806296/ /pubmed/29450000 http://dx.doi.org/10.1186/s41077-016-0033-y Text en © The Author(s) 2016 Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
title Validation of educational assessments: a primer for simulation and beyond
topic Methodology Article