
Common evidence gaps in point-of-care diagnostic test evaluation: a review of horizon scan reports

OBJECTIVE: Since 2008, the Oxford Diagnostic Horizon Scan Programme has been identifying and summarising evidence on new and emerging diagnostic technologies relevant to primary care. We used these reports to determine the sequence and timing of evidence for new point-of-care diagnostic tests and to identify common evidence gaps in this process.

DESIGN: Systematic overview of diagnostic horizon scan reports.

PRIMARY OUTCOME MEASURES: We obtained the primary studies referenced in each horizon scan report (n=40) and extracted details of the study size, clinical setting and design characteristics. In particular, we assessed whether each study evaluated test accuracy, test impact or cost-effectiveness. The evidence for each point-of-care test was mapped against the Horvath framework for diagnostic test evaluation.

RESULTS: We extracted data from 500 primary studies. Most diagnostic technologies underwent clinical performance (ie, ability to detect a clinical condition) assessment (71.2%), with very few progressing to comparative clinical effectiveness (10.0%) and a cost-effectiveness evaluation (8.6%), even in the more established and frequently reported clinical domains, such as cardiovascular disease. The median time to complete an evaluation cycle was 9 years (IQR 5.5–12.5 years). The sequence of evidence generation was typically haphazard, and some diagnostic tests appear to be implemented in routine care without completing essential evaluation stages such as clinical effectiveness.

CONCLUSIONS: Evidence generation for new point-of-care diagnostic tests is slow, tends to focus on accuracy, and overlooks other test attributes such as impact, implementation and cost-effectiveness. Evaluation of this dynamic cycle and feeding back data from clinical effectiveness to refine analytical and clinical performance are key to improving the efficiency of point-of-care diagnostic test development and its impact on clinically relevant outcomes. While the ‘road map’ of the steps needed to generate evidence is reasonably well delineated, we provide evidence on the complexity, length and variability of the actual process that many diagnostic technologies undergo.

Bibliographic Details
Main Authors: Verbakel, Jan Y; Turner, Philip J; Thompson, Matthew J; Plüddemann, Annette; Price, Christopher P; Shinkins, Bethany; Van den Bruel, Ann
Format: Online Article Text
Language: English
Published: BMJ Publishing Group, 2017
Subjects: Diagnostics
License: Open Access, CC BY-NC 4.0 (http://creativecommons.org/licenses/by-nc/4.0/)
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5588931/
https://www.ncbi.nlm.nih.gov/pubmed/28864692
http://dx.doi.org/10.1136/bmjopen-2016-015760