
Reporting quality in abstracts of meta-analyses of depression screening tool accuracy: a review of systematic reviews and meta-analyses

Bibliographic Details
Main Authors: Rice, Danielle B; Kloda, Lorie A; Shrier, Ian; Thombs, Brett D
Format: Online Article Text
Language: English
Published: BMJ Publishing Group, 2016
Subjects: Diagnostics
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5128996/
https://www.ncbi.nlm.nih.gov/pubmed/27864250
http://dx.doi.org/10.1136/bmjopen-2016-012867
author Rice, Danielle B
Kloda, Lorie A
Shrier, Ian
Thombs, Brett D
collection PubMed
description OBJECTIVE: Concerns have been raised regarding the quality and completeness of abstract reporting in evidence reviews, but this has not been evaluated in meta-analyses of diagnostic accuracy. Our objective was to evaluate reporting quality and completeness in abstracts of systematic reviews with meta-analyses of depression screening tool accuracy, using the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) for Abstracts tool. DESIGN: Cross-sectional study. INCLUSION CRITERIA: We searched MEDLINE and PsycINFO from 1 January 2005 through 13 March 2016 for recent systematic reviews with meta-analyses, in any language, that compared a depression screening tool to a diagnosis based on a clinical or validated diagnostic interview. DATA EXTRACTION: Two reviewers independently assessed the quality and completeness of abstract reporting using the PRISMA for Abstracts tool, with appropriate adaptations made for studies of diagnostic test accuracy. Bivariate associations between the number of PRISMA for Abstracts items complied with and (1) the journal abstract word limit and (2) A Measurement Tool to Assess Systematic Reviews (AMSTAR) scores of the meta-analyses were also assessed. RESULTS: We identified 21 eligible meta-analyses. Only two of the 21 included meta-analyses complied with at least half of the adapted PRISMA for Abstracts items. The majority met criteria for reporting an appropriate title (95%), interpretation of results (95%) and synthesis of results (76%). Meta-analyses less consistently reported databases searched (43%), associated search dates (33%) and strengths and limitations of the evidence (19%). Most meta-analyses did not adequately report a clinically meaningful description of outcomes (14%), risk of bias (14%), included study characteristics (10%), study eligibility criteria (5%), registration information (5%), clear objectives (0%), report eligibility criteria (0%) or funding (0%). Overall meta-analysis quality scores were significantly associated with the number of PRISMA for Abstracts items reported adequately (r=0.45). CONCLUSIONS: Quality and completeness of reporting were found to be suboptimal. Journal editors should endorse PRISMA for Abstracts and allow flexibility in abstract word counts to improve the quality of abstracts.
format Online Article Text
id pubmed-5128996
institution National Center for Biotechnology Information
language English
publishDate 2016
publisher BMJ Publishing Group
record_format MEDLINE/PubMed
spelling pubmed-5128996 2016-12-02 BMJ Open Diagnostics BMJ Publishing Group 2016-11-18 /pmc/articles/PMC5128996/ /pubmed/27864250 http://dx.doi.org/10.1136/bmjopen-2016-012867 Text en Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/ This is an Open Access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/
title Reporting quality in abstracts of meta-analyses of depression screening tool accuracy: a review of systematic reviews and meta-analyses
topic Diagnostics
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5128996/
https://www.ncbi.nlm.nih.gov/pubmed/27864250
http://dx.doi.org/10.1136/bmjopen-2016-012867