Assessment of reporting quality of abstracts of systematic reviews with meta-analysis using PRISMA-A and discordance in assessments between raters without prior experience

Bibliographic Details
Main Authors: Maticic, Katarina; Krnic Martinic, Marina; Puljak, Livia
Format: Online Article Text
Language: English
Published: BioMed Central, 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6376734/
https://www.ncbi.nlm.nih.gov/pubmed/30764774
http://dx.doi.org/10.1186/s12874-019-0675-2
collection PubMed
description
BACKGROUND: The reporting quality of systematic review (SR) abstracts is important because the abstract is often the only information about a study that readers have. The aim of this study was to assess the adherence of SR abstracts in the field of anesthesiology to the reporting checklist PRISMA extension for Abstracts (PRISMA-A), and to analyze to what extent the use of PRISMA-A yields concordant ratings between two raters without prior experience with the checklist.

METHODS: We analyzed the reporting quality of SRs with meta-analysis of randomized controlled trials of interventions published in the field of anesthesiology from 2012 to 2016, using the 12-item PRISMA-A checklist. After a calibration exercise, two authors without prior experience with PRISMA-A scored the abstracts. The primary outcome was median adherence to the PRISMA-A checklist; the secondary outcome was adherence to individual items of the checklist. We analyzed whether the reporting of SR abstracts improved over time, and additionally examined discrepancies between the two raters in scoring individual PRISMA-A items.

RESULTS: Our search yielded 318 results, of which we included 244 SRs. Median adherence to the PRISMA-A checklist was 42% (5 of 12 items). The majority of analyzed SR abstracts (N = 148, 61%) had a total adherence score under 50%, and not a single one had adherence above 75%. Adherence to individual items was highly variable, ranging from 0% for reporting SR funding to 97% for interpreting SR findings. Overall adherence to PRISMA-A did not change over the 5 analyzed years, which spanned the 2013 publication of PRISMA-A. Even after the calibration exercise, discrepancies between the two raters were found in 275 (9.3%) of the 2928 analyzed PRISMA-A items; Cohen's kappa was 0.807. For the item about the description of effect, the raters disagreed on 59% of the abstracts.

CONCLUSION: The reporting quality of systematic review abstracts in the field of anesthesiology is suboptimal and did not improve after publication of the PRISMA-A checklist in 2013. Stricter adherence to reporting checklists by authors, editors, and peer reviewers is needed, along with interventions that help those stakeholders improve the reporting of systematic reviews. Some items of the PRISMA-A checklist are difficult to score.
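The inter-rater agreement statistic cited in the results (Cohen's kappa 0.807, with 275 discrepancies among 2928 scored items) can be illustrated with a minimal sketch of how kappa is computed for two raters scoring binary (reported / not reported) checklist items. The ratings below are hypothetical toy data, not the study's data:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two equal-length lists of category labels."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    categories = set(rater_a) | set(rater_b)
    # Observed agreement: fraction of items on which the raters agree.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement: from each rater's marginal category frequencies.
    p_e = sum(
        (rater_a.count(c) / n) * (rater_b.count(c) / n) for c in categories
    )
    return (p_o - p_e) / (1 - p_e)

# Hypothetical scores for 10 checklist items (1 = reported, 0 = not reported).
a = [1, 1, 0, 1, 0, 1, 1, 0, 1, 1]
b = [1, 1, 0, 1, 1, 1, 1, 0, 1, 0]
print(round(cohens_kappa(a, b), 3))  # → 0.524
```

Kappa corrects the raw agreement rate (here 8/10 = 0.8) for the agreement expected by chance from each rater's marginal frequencies, which is why a 9.3% raw discrepancy rate in the study still corresponds to a kappa below 1.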
format Online Article Text
id pubmed-6376734
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
journal BMC Med Res Methodol (Research Article)
spelling BioMed Central 2019-02-14 /pmc/articles/PMC6376734/ /pubmed/30764774 http://dx.doi.org/10.1186/s12874-019-0675-2 Text en © The Author(s). 2019. Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.