
Assessment of the abstract reporting of systematic reviews of dose-response meta-analysis: a literature survey

BACKGROUND: An increasing number of systematic reviews (SRs) of dose-response meta-analyses (DRMAs) have been published over the past decades. However, the quality of abstract reporting in these SR-DRMAs remains poorly understood. We conducted a literature survey to investigate the abstract reporting of SR-DRMAs.

METHODS: Medline, Embase, and the Wiley Online Library were searched for eligible SR-DRMAs. The reporting quality of each abstract was assessed with a modified PRISMA for Abstracts checklist (14 items). We summarized the adherence rate for each item and categorized items as well complied with (adhered to by 80% or more of reviews), moderately complied with (50 to 79%), or poorly complied with (less than 50%). We used the total score to reflect abstract quality and used regression analysis to explore potential influencing factors.

RESULTS: We included 529 SR-DRMAs. Eight of the 14 items were moderately (3 items) or poorly (5 items) complied with, while only 6 were well complied with. Most SR-DRMAs failed to describe the methods for risk of bias assessment (30.2%, 95% CI: 26.4, 34.4%) or the results of the bias assessment (48.8%, 95% CI: 44.4, 53.1%). Few SR-DRMAs reported funding (2.3%, 95% CI: 1.2, 3.9%) or registration (0.6%, 95% CI: 0.1, 1.6%) information in the abstract. Multivariable regression analysis suggested that abstract word count [> 250 vs. ≤ 250; estimated β = 0.31; 95% CI: 0.02, 0.61; P = 0.039] was positively associated with abstract reporting quality.

CONCLUSION: The abstract reporting of SR-DRMAs is suboptimal, and substantial effort is needed to improve it. A higher word count may benefit abstract reporting. Given that the abstract largely depends on the reporting and conduct of the SR-DRMA itself, review authors should also focus on the completeness of the full review.

ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (10.1186/s12874-019-0798-5) contains supplementary material, which is available to authorized users.
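The METHODS section above summarizes each checklist item's adherence rate with a 95% confidence interval and assigns it to a compliance category. The short Python sketch below is only an illustration of that scheme under stated assumptions: it is not the authors' code, the paper does not specify its confidence-interval method (a Wilson score interval is used here), and the item-level count is hypothetical.

```python
# Minimal sketch (not from the paper): adherence rate for one checklist item,
# a 95% Wilson score interval (the paper's CI method is not stated), and the
# compliance category thresholds given in the abstract.
import math

def wilson_ci(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """95% Wilson score interval for a proportion."""
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return center - half, center + half

def compliance_category(rate: float) -> str:
    """Thresholds from the abstract: >=80% well, 50-79% moderately, <50% poorly complied."""
    if rate >= 0.80:
        return "well complied"
    if rate >= 0.50:
        return "moderately complied"
    return "poorly complied"

n_reviews = 529      # number of included SR-DRMAs (from the abstract)
n_adherent = 300     # hypothetical number of reviews adhering to one item

rate = n_adherent / n_reviews
low, high = wilson_ci(n_adherent, n_reviews)
print(f"adherence {rate:.1%} (95% CI: {low:.1%}, {high:.1%}) -> {compliance_category(rate)}")
```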


Bibliographic Details
Main Authors: Jia, Peng-Li; Xu, Bin; Cheng, Jing-Min; Huang, Xi-Hao; Kwong, Joey S. W.; Liu, Yu; Zhang, Chao; Han, Ying; Xu, Chang
Format: Online Article Text
Language: English
Journal: BMC Med Res Methodol
Published: BioMed Central, 2019-07-15
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6631883/
https://www.ncbi.nlm.nih.gov/pubmed/31307388
http://dx.doi.org/10.1186/s12874-019-0798-5
License: © The Author(s) 2019. Open Access: this article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).