Lessons from the evaluation of the UK's NHS R&D Implementation Methods Programme
BACKGROUND: Concern about the effective use of research was a major factor behind the creation of the NHS R&D Programme in 1991. In 1994, an advisory group was established to identify research priorities in research implementation. The Implementation Methods Programme (IMP) flowed from this, and...
| Main Authors: | Soper, Bryony; Hanney, Stephen R |
|---|---|
| Format: | Text |
| Language: | English |
| Published: | BioMed Central, 2007 |
| Subjects: | Research Article |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1805450/ https://www.ncbi.nlm.nih.gov/pubmed/17309803 http://dx.doi.org/10.1186/1748-5908-2-7 |
_version_ | 1782132480164233216 |
author | Soper, Bryony Hanney, Stephen R |
author_facet | Soper, Bryony Hanney, Stephen R |
author_sort | Soper, Bryony |
collection | PubMed |
description | BACKGROUND: Concern about the effective use of research was a major factor behind the creation of the NHS R&D Programme in 1991. In 1994, an advisory group was established to identify research priorities in research implementation. The Implementation Methods Programme (IMP) flowed from this, and its commissioning group funded 36 projects. In 2000 responsibility for the programme passed to the National Co-ordinating Centre for NHS Service Delivery and Organisation R&D, which asked the Health Economics Research Group (HERG), Brunel University, to conduct an evaluation in 2002. By then most projects had been completed. This evaluation was intended to cover: the quality of outputs, lessons to be learnt about the communication strategy and the commissioning process, and the benefits from the projects. METHODS: We adopted a wide range of quantitative and qualitative methods. They included: documentary analysis, interviews with key actors, questionnaires to the funded lead researchers, questionnaires to potential users, and desk analysis. RESULTS: Quantitative assessment of outputs and dissemination revealed that the IMP funded useful research projects, some of which had considerable impact against the various categories in the HERG payback model, such as publications, further research, research training, impact on health policy, and clinical practice. Qualitative findings from interviews with advisory and commissioning group members indicated that when the IMP was established, implementation research was a relatively unexplored field. This was reflected in the understanding brought to their roles by members of the advisory and commissioning groups, in the way priorities for research were chosen and developed, and in how the research projects were commissioned. The ideological and methodological debates associated with these decisions have continued among those working in this field. The need for an effective communication strategy for the programme as a whole was particularly important. However, such a strategy was never developed, making it difficult to establish the general influence of the IMP as a programme. CONCLUSION: Our findings about the impact of the work funded, and the difficulties faced by those developing the IMP, have implications for the development of strategic programmes of research in general, as well as for the development of more effective research in this field. |
format | Text |
id | pubmed-1805450 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2007 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-1805450 2007-02-27 Lessons from the evaluation of the UK's NHS R&D Implementation Methods Programme Soper, Bryony Hanney, Stephen R Implement Sci Research Article BACKGROUND: Concern about the effective use of research was a major factor behind the creation of the NHS R&D Programme in 1991. In 1994, an advisory group was established to identify research priorities in research implementation. The Implementation Methods Programme (IMP) flowed from this, and its commissioning group funded 36 projects. In 2000 responsibility for the programme passed to the National Co-ordinating Centre for NHS Service Delivery and Organisation R&D, which asked the Health Economics Research Group (HERG), Brunel University, to conduct an evaluation in 2002. By then most projects had been completed. This evaluation was intended to cover: the quality of outputs, lessons to be learnt about the communication strategy and the commissioning process, and the benefits from the projects. METHODS: We adopted a wide range of quantitative and qualitative methods. They included: documentary analysis, interviews with key actors, questionnaires to the funded lead researchers, questionnaires to potential users, and desk analysis. RESULTS: Quantitative assessment of outputs and dissemination revealed that the IMP funded useful research projects, some of which had considerable impact against the various categories in the HERG payback model, such as publications, further research, research training, impact on health policy, and clinical practice. Qualitative findings from interviews with advisory and commissioning group members indicated that when the IMP was established, implementation research was a relatively unexplored field. This was reflected in the understanding brought to their roles by members of the advisory and commissioning groups, in the way priorities for research were chosen and developed, and in how the research projects were commissioned. The ideological and methodological debates associated with these decisions have continued among those working in this field. The need for an effective communication strategy for the programme as a whole was particularly important. However, such a strategy was never developed, making it difficult to establish the general influence of the IMP as a programme. CONCLUSION: Our findings about the impact of the work funded, and the difficulties faced by those developing the IMP, have implications for the development of strategic programmes of research in general, as well as for the development of more effective research in this field. BioMed Central 2007-02-19 /pmc/articles/PMC1805450/ /pubmed/17309803 http://dx.doi.org/10.1186/1748-5908-2-7 Text en Copyright © 2007 Soper and Hanney; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. |
spellingShingle | Research Article Soper, Bryony Hanney, Stephen R Lessons from the evaluation of the UK's NHS R&D Implementation Methods Programme |
title | Lessons from the evaluation of the UK's NHS R&D Implementation Methods Programme |
title_full | Lessons from the evaluation of the UK's NHS R&D Implementation Methods Programme |
title_fullStr | Lessons from the evaluation of the UK's NHS R&D Implementation Methods Programme |
title_full_unstemmed | Lessons from the evaluation of the UK's NHS R&D Implementation Methods Programme |
title_short | Lessons from the evaluation of the UK's NHS R&D Implementation Methods Programme |
title_sort | lessons from the evaluation of the uk's nhs r&d implementation methods programme |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC1805450/ https://www.ncbi.nlm.nih.gov/pubmed/17309803 http://dx.doi.org/10.1186/1748-5908-2-7 |
work_keys_str_mv | AT soperbryony lessonsfromtheevaluationoftheuksnhsrdimplementationmethodsprogramme AT hanneystephenr lessonsfromtheevaluationoftheuksnhsrdimplementationmethodsprogramme |