
Analysing synthesis of evidence in a systematic review in health professions education: observations on struggling beyond Kirkpatrick


Bibliographic Details
Main Authors: Maudsley, Gillian, Taylor, David
Format: Online Article Text
Language: English
Published: Taylor & Francis 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7170338/
https://www.ncbi.nlm.nih.gov/pubmed/32228373
http://dx.doi.org/10.1080/10872981.2020.1731278
author Maudsley, Gillian
Taylor, David
collection PubMed
description Background: Systematic reviews in health professions education may well under-report struggles to synthesize disparate evidence that defies standard quantitative approaches. This paper reports further process analysis in a previously reported systematic review about mobile devices on clinical placements.
Objective: For a troublesome systematic review: (1) Analyse further the distribution and reliability of classifying the evidence to Maxwell quality dimensions (beyond ‘Does it work?’) and their overlap with Kirkpatrick K-levels. (2) Analyse how the abstracts represented those dimensions of the evidence-base. (3) Reflect on difficulties in synthesis and merits of Maxwell dimensions.
Design: Following integrative synthesis of 45 K2–K4 primary studies (by combined content–thematic analysis in the pragmatism paradigm): (1) Hierarchical cluster analysis explored overlap between Maxwell dimensions and K-levels. Independent coding and consensus coding to Maxwell dimensions were compared (using percentages, kappa, and McNemar hypothesis testing) pre- vs post-discussion and (2) article abstract vs main body. (3) Narrative summary captured process difficulties and merits.
Results: (1) The largest cluster (five-cluster dendrogram) was acceptability–accessibility–K1–appropriateness–K3, with K1 and K4 widely separated. For article main bodies, independent coding agreed most for appropriateness (good; adjusted kappa = 0.78). Coded evidence increased significantly from pre- to post-discussion for acceptability (p = 0.008; 31/45→39/45), accessibility, and equity-ethics-professionalism. (2) Abstracts suggested efficiency significantly less often than main bodies evidenced it (31.1% vs 44.4%, p = 0.031). (3) Challenges and merits emerged for before, during, and after the review.
Conclusions: There should be more systematic reporting of process analysis about difficulties synthesizing suboptimal evidence-bases. In this example, Maxwell dimensions were a useful framework beyond K-levels for classifying and synthesizing the evidence-base.
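The description above reports paired McNemar comparisons and an "adjusted kappa" without the underlying 2x2 tables, which this record does not include. The Python sketch below is an illustration only, not the authors' code or data: the discordant cell counts are hypothetical, chosen solely so that the marginals and p-values are consistent with the figures reported above, and "adjusted kappa" is assumed here to mean the prevalence- and bias-adjusted kappa (PABAK).

    # Hypothetical illustration of the reported statistics (not the review's data).
    from statsmodels.stats.contingency_tables import mcnemar

    # Acceptability coded in 45 articles, pre- vs post-discussion (reported: 31/45 -> 39/45).
    # Rows = pre-discussion (yes, no); columns = post-discussion (yes, no).
    # The discordant cells (0 and 8) are assumptions consistent with those marginals.
    acceptability = [[31, 0],
                     [8, 6]]
    print(mcnemar(acceptability, exact=True).pvalue)  # 0.0078125, close to the reported p = 0.008

    # Efficiency suggested in abstracts vs evidenced in main bodies
    # (reported: 14/45 = 31.1% vs 20/45 = 44.4%); discordant cells again assumed.
    efficiency = [[14, 0],
                  [6, 25]]
    print(mcnemar(efficiency, exact=True).pvalue)  # 0.03125, close to the reported p = 0.031

    # If 'adjusted kappa' means PABAK (prevalence- and bias-adjusted kappa),
    # it is a simple function of raw agreement: PABAK = 2 * p_observed - 1.
    def pabak(n_agree: int, n_total: int = 45) -> float:
        """Prevalence- and bias-adjusted kappa from raw two-coder agreement."""
        return 2 * n_agree / n_total - 1

    print(round(pabak(40), 2))  # 0.78, i.e., agreement on about 40 of 45 articles

The exact (binomial) form of McNemar's test is used because the assumed discordant counts are small; only the totals and marginals come from the abstract, so the split of the discordant cells and the PABAK reading of "adjusted kappa" remain assumptions.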
format Online
Article
Text
id pubmed-7170338
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Taylor & Francis
record_format MEDLINE/PubMed
spelling pubmed-7170338 2020-04-27 Maudsley, Gillian; Taylor, David. Analysing synthesis of evidence in a systematic review in health professions education: observations on struggling beyond Kirkpatrick. Med Educ Online. Taylor & Francis, 2020-03-31. /pmc/articles/PMC7170338/ /pubmed/32228373 http://dx.doi.org/10.1080/10872981.2020.1731278 Text en © 2020 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
title Analysing synthesis of evidence in a systematic review in health professions education: observations on struggling beyond Kirkpatrick
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7170338/
https://www.ncbi.nlm.nih.gov/pubmed/32228373
http://dx.doi.org/10.1080/10872981.2020.1731278