The practice of ‘doing’ evaluation: lessons learned from nine complex intervention trials in action
Main Authors: Reynolds, Joanna; DiLiberto, Deborah; Mangham-Jefferies, Lindsay; Ansah, Evelyn K; Lal, Sham; Mbakilwa, Hilda; Bruxvoort, Katia; Webster, Jayne; Vestergaard, Lasse S; Yeung, Shunmay; Leslie, Toby; Hutchinson, Eleanor; Reyburn, Hugh; Lalloo, David G; Schellenberg, David; Cundill, Bonnie; Staedke, Sarah G; Wiseman, Virginia; Goodman, Catherine; Chandler, Clare IR
Format: Online Article Text
Language: English
Published: BioMed Central, 2014
Subjects: Research
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4079170/ https://www.ncbi.nlm.nih.gov/pubmed/24935096 http://dx.doi.org/10.1186/1748-5908-9-75
author | Reynolds, Joanna; DiLiberto, Deborah; Mangham-Jefferies, Lindsay; Ansah, Evelyn K; Lal, Sham; Mbakilwa, Hilda; Bruxvoort, Katia; Webster, Jayne; Vestergaard, Lasse S; Yeung, Shunmay; Leslie, Toby; Hutchinson, Eleanor; Reyburn, Hugh; Lalloo, David G; Schellenberg, David; Cundill, Bonnie; Staedke, Sarah G; Wiseman, Virginia; Goodman, Catherine; Chandler, Clare IR
author_sort | Reynolds, Joanna |
collection | PubMed |
description | BACKGROUND: There is increasing recognition among trialists of the challenges in understanding how particular ‘real-life’ contexts influence the delivery and receipt of complex health interventions. Evaluations of interventions to change health worker and/or patient behaviours in health service settings exemplify these challenges. When interpreting evaluation data, deviation from intended intervention implementation is accounted for through process evaluations of fidelity, reach, and intensity. However, no such systematic approach has been proposed to account for the way evaluation activities may deviate in practice from assumptions made when data are interpreted. METHODS: A collective case study was conducted to explore experiences of undertaking evaluation activities in the real-life contexts of nine complex intervention trials seeking to improve appropriate diagnosis and treatment of malaria in varied health service settings. Multiple sources of data were used, including in-depth interviews with investigators, participant-observation of studies, and rounds of discussion and reflection. RESULTS AND DISCUSSION: From our experiences of the realities of conducting these evaluations, we identified six key ‘lessons learned’ about ways to become aware of and manage aspects of the fabric of trials involving the interface of researchers, fieldworkers, participants and data collection tools that may affect the intended production of data and interpretation of findings. These lessons included: foster a shared understanding across the study team of how individual practices contribute to the study goals; promote and facilitate within-team communications for ongoing reflection on the progress of the evaluation; establish processes for ongoing collaboration and dialogue between sub-study teams; the importance of a field research coordinator bridging everyday project management with scientific oversight; collect and review reflective field notes on the progress of the evaluation to aid interpretation of outcomes; and these approaches should help the identification of and reflection on possible overlaps between the evaluation and intervention. CONCLUSION: The lessons we have drawn point to the principle of reflexivity that, we argue, needs to become part of standard practice in the conduct of evaluations of complex interventions to promote more meaningful interpretations of the effects of an intervention and to better inform future implementation and decision-making. |
format | Online Article Text |
id | pubmed-4079170 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2014 |
publisher | BioMed Central |
record_format | MEDLINE/PubMed |
spelling | pubmed-4079170 (2014-07-03). Implement Sci, Research. BioMed Central, 2014-06-17. /pmc/articles/PMC4079170/ /pubmed/24935096 http://dx.doi.org/10.1186/1748-5908-9-75. Text, en. Copyright © 2014 Reynolds et al.; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly credited. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
title | The practice of ‘doing’ evaluation: lessons learned from nine complex intervention trials in action |
topic | Research |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4079170/ https://www.ncbi.nlm.nih.gov/pubmed/24935096 http://dx.doi.org/10.1186/1748-5908-9-75 |