
Standardizing an approach to the evaluation of implementation science proposals

BACKGROUND: The fields of implementation and improvement sciences have experienced rapid growth in recent years. However, research that seeks to inform health care change may have difficulty translating core components of implementation and improvement sciences within the traditional paradigms used to evaluate efficacy and effectiveness research. A review of implementation and improvement sciences grant proposals within an academic medical center, conducted using a traditional National Institutes of Health framework, highlighted the need for tools that could assist investigators and reviewers in describing and evaluating proposed implementation and improvement sciences research.

METHODS: We operationalized existing recommendations for writing implementation science proposals as the ImplemeNtation and Improvement Science Proposals Evaluation CriTeria (INSPECT) scoring system. The resulting system was applied to pilot grants submitted in response to a call for implementation and improvement science proposals at an academic medical center. We evaluated the reliability of the INSPECT system using Krippendorff's alpha coefficients and explored its utility for characterizing common deficiencies in implementation research proposals.

RESULTS: We scored 30 research proposals using the INSPECT system. Proposals received a median cumulative score of 7 out of a possible 30. Across the individual INSPECT elements, proposals scored highest on the criterion rating evidence of a care or quality gap, and performed poorly on all other criteria. Most proposals received scores of 0 on the criteria for identifying an evidence-based practice or treatment (50%), conceptual model and theoretical justification (70%), the setting's readiness to adopt new services/treatments/programs (54%), implementation strategy/process (67%), and measurement and analysis (70%). Inter-coder reliability testing showed excellent overall reliability (Krippendorff's alpha coefficient 0.88), with coefficients for individual elements ranging from 0.77 to 0.99.

CONCLUSIONS: The INSPECT scoring system provides new scoring criteria with a high degree of inter-rater reliability and utility for evaluating the quality of implementation and improvement sciences grant proposals.

Bibliographic Details
Main Authors: Crable, Erika L.; Biancarelli, Dea; Walkey, Allan J.; Allen, Caitlin G.; Proctor, Enola K.; Drainoni, Mari-Lynn
Format: Online Article (Text)
Language: English
Published: BioMed Central, 29 May 2018
Subjects: Methodology
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5975262/
https://www.ncbi.nlm.nih.gov/pubmed/29843740
http://dx.doi.org/10.1186/s13012-018-0770-5