Guidance to Best Tools and Practices for Systematic Reviews

Bibliographic Details
Main Authors: Kolaski, Kat; Logan, Lynne Romeiser; Ioannidis, John P.A.
Format: Online Article Text
Language: English
Published: Journal of Bone and Joint Surgery, Inc. 2023
Subjects: Editorial
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10259219/
https://www.ncbi.nlm.nih.gov/pubmed/37285444
http://dx.doi.org/10.2106/JBJS.RVW.23.00077
_version_ 1785057619074875392
author Kolaski, Kat
Logan, Lynne Romeiser
Ioannidis, John P.A.
author_facet Kolaski, Kat
Logan, Lynne Romeiser
Ioannidis, John P.A.
author_sort Kolaski, Kat
collection PubMed
description » Data continue to accumulate indicating that many systematic reviews are methodologically flawed, biased, redundant, or uninformative. Some improvements have occurred in recent years based on empirical methods research and standardization of appraisal tools; however, many authors do not routinely or consistently apply these updated methods. In addition, guideline developers, peer reviewers, and journal editors often disregard current methodological standards. Although extensively acknowledged and explored in the methodological literature, most clinicians seem unaware of these issues and may automatically accept evidence syntheses (and clinical practice guidelines based on their conclusions) as trustworthy.
» A plethora of methods and tools are recommended for the development and evaluation of evidence syntheses. It is important to understand what these are intended to do (and cannot do) and how they can be utilized. Our objective is to distill this sprawling information into a format that is understandable and readily accessible to authors, peer reviewers, and editors. In doing so, we aim to promote appreciation and understanding of the demanding science of evidence synthesis among stakeholders. We focus on well-documented deficiencies in key components of evidence syntheses to elucidate the rationale for current standards. The constructs underlying the tools developed to assess reporting, risk of bias, and methodological quality of evidence syntheses are distinguished from those involved in determining overall certainty of a body of evidence. Another important distinction is made between those tools used by authors to develop their syntheses as opposed to those used to ultimately judge their work.
» Exemplar methods and research practices are described, complemented by novel pragmatic strategies to improve evidence syntheses. The latter include preferred terminology and a scheme to characterize types of research evidence. We organize best practice resources in a Concise Guide that can be widely adopted and adapted for routine implementation by authors and journals. Appropriate, informed use of these is encouraged, but we caution against their superficial application and emphasize their endorsement does not substitute for in-depth methodological training. By highlighting best practices with their rationale, we hope this guidance will inspire further evolution of methods and tools that can advance the field.
format Online
Article
Text
id pubmed-10259219
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Journal of Bone and Joint Surgery, Inc.
record_format MEDLINE/PubMed
spelling pubmed-10259219 2023-06-13 Guidance to Best Tools and Practices for Systematic Reviews Kolaski, Kat Logan, Lynne Romeiser Ioannidis, John P.A. JBJS Rev Editorial
» Data continue to accumulate indicating that many systematic reviews are methodologically flawed, biased, redundant, or uninformative. Some improvements have occurred in recent years based on empirical methods research and standardization of appraisal tools; however, many authors do not routinely or consistently apply these updated methods. In addition, guideline developers, peer reviewers, and journal editors often disregard current methodological standards. Although extensively acknowledged and explored in the methodological literature, most clinicians seem unaware of these issues and may automatically accept evidence syntheses (and clinical practice guidelines based on their conclusions) as trustworthy.
» A plethora of methods and tools are recommended for the development and evaluation of evidence syntheses. It is important to understand what these are intended to do (and cannot do) and how they can be utilized. Our objective is to distill this sprawling information into a format that is understandable and readily accessible to authors, peer reviewers, and editors. In doing so, we aim to promote appreciation and understanding of the demanding science of evidence synthesis among stakeholders. We focus on well-documented deficiencies in key components of evidence syntheses to elucidate the rationale for current standards. The constructs underlying the tools developed to assess reporting, risk of bias, and methodological quality of evidence syntheses are distinguished from those involved in determining overall certainty of a body of evidence. Another important distinction is made between those tools used by authors to develop their syntheses as opposed to those used to ultimately judge their work.
» Exemplar methods and research practices are described, complemented by novel pragmatic strategies to improve evidence syntheses. The latter include preferred terminology and a scheme to characterize types of research evidence. We organize best practice resources in a Concise Guide that can be widely adopted and adapted for routine implementation by authors and journals. Appropriate, informed use of these is encouraged, but we caution against their superficial application and emphasize their endorsement does not substitute for in-depth methodological training. By highlighting best practices with their rationale, we hope this guidance will inspire further evolution of methods and tools that can advance the field.
Journal of Bone and Joint Surgery, Inc. 2023-06-07 /pmc/articles/PMC10259219/ /pubmed/37285444 http://dx.doi.org/10.2106/JBJS.RVW.23.00077 Text en Copyright © 2023 The Authors. Published by The Journal of Bone and Joint Surgery, Incorporated. All rights reserved. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License 4.0 (https://creativecommons.org/licenses/by/4.0/) (CC BY), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Editorial
Kolaski, Kat
Logan, Lynne Romeiser
Ioannidis, John P.A.
Guidance to Best Tools and Practices for Systematic Reviews
title Guidance to Best Tools and Practices for Systematic Reviews
title_full Guidance to Best Tools and Practices for Systematic Reviews
title_fullStr Guidance to Best Tools and Practices for Systematic Reviews
title_full_unstemmed Guidance to Best Tools and Practices for Systematic Reviews
title_short Guidance to Best Tools and Practices for Systematic Reviews
title_sort guidance to best tools and practices for systematic reviews
topic Editorial
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10259219/
https://www.ncbi.nlm.nih.gov/pubmed/37285444
http://dx.doi.org/10.2106/JBJS.RVW.23.00077
work_keys_str_mv AT kolaskikat guidancetobesttoolsandpracticesforsystematicreviews
AT loganlynneromeiser guidancetobesttoolsandpracticesforsystematicreviews
AT ioannidisjohnpa guidancetobesttoolsandpracticesforsystematicreviews