
Considerations for conducting systematic reviews: evaluating the performance of different methods for de-duplicating references

BACKGROUND: Systematic reviews involve searching multiple bibliographic databases to identify eligible studies. As this type of evidence synthesis is increasingly pursued, the use of various electronic platforms can help researchers improve the efficiency and quality of their research. We examined the accuracy and efficiency of commonly used electronic methods for flagging and removing duplicate references during this process. METHODS: A heterogeneous sample of references was obtained by conducting a similar topical search in MEDLINE, Embase, Cochrane Central Register of Controlled Trials, and PsycINFO databases. References were de-duplicated via manual abstraction to create a benchmark set. The default settings were then used in Ovid multifile search, EndNote desktop, Mendeley, Zotero, Covidence, and Rayyan to de-duplicate the sample of references independently. Using the benchmark set as reference, the number of false-negative and false-positive duplicate references for each method was identified, and accuracy, sensitivity, and specificity were determined. RESULTS: We found that the most accurate methods for identifying duplicate references were Ovid, Covidence, and Rayyan. Ovid and Covidence possessed the highest specificity for identifying duplicate references, while Rayyan demonstrated the highest sensitivity. CONCLUSION: This study reveals the strengths and weaknesses of commonly used de-duplication methods and provides strategies for improving their performance to avoid unintentionally removing eligible studies and introducing bias into systematic reviews. Along with availability, ease-of-use, functionality, and capability, these findings are important to consider when researchers are selecting database platforms and supporting software programs for conducting systematic reviews. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s13643-021-01583-y.
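The abstract above describes tools that flag duplicate references retrieved from multiple databases. The following is an illustrative sketch only (it is not any of the evaluated tools' actual algorithms, and the field names are assumptions): many reference managers, by default, treat two records as duplicates when a normalized key built from title, first author, and year coincides.

```python
import re

# Hypothetical key-based duplicate flagging; field names ("title",
# "first_author", "year") are assumptions for illustration, not a
# specific tool's schema.
def dedup_key(record: dict[str, str]) -> tuple[str, str, str]:
    # Lowercase and collapse punctuation/whitespace so trivial
    # formatting differences between databases do not hide a match.
    title = re.sub(r"[^a-z0-9]+", " ", record.get("title", "").lower()).strip()
    author = record.get("first_author", "").lower().strip()
    year = record.get("year", "")
    return (title, author, year)

def flag_duplicates(records: list[dict[str, str]]) -> list[bool]:
    """Return a parallel list: True where a record repeats an earlier key."""
    seen: set[tuple[str, str, str]] = set()
    flags: list[bool] = []
    for rec in records:
        key = dedup_key(rec)
        flags.append(key in seen)
        seen.add(key)
    return flags
```

A key this coarse illustrates the trade-off the study measures: looser matching raises sensitivity (more true duplicates caught) but risks false positives, i.e. distinct eligible studies being flagged and removed.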


Bibliographic Details
Main Authors: McKeown, Sandra, Mir, Zuhaib M.
Format: Online Article Text
Language: English
Published: BioMed Central 2021
Subjects: Methodology
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7827976/
https://www.ncbi.nlm.nih.gov/pubmed/33485394
http://dx.doi.org/10.1186/s13643-021-01583-y
author McKeown, Sandra
Mir, Zuhaib M.
collection PubMed
description BACKGROUND: Systematic reviews involve searching multiple bibliographic databases to identify eligible studies. As this type of evidence synthesis is increasingly pursued, the use of various electronic platforms can help researchers improve the efficiency and quality of their research. We examined the accuracy and efficiency of commonly used electronic methods for flagging and removing duplicate references during this process. METHODS: A heterogeneous sample of references was obtained by conducting a similar topical search in MEDLINE, Embase, Cochrane Central Register of Controlled Trials, and PsycINFO databases. References were de-duplicated via manual abstraction to create a benchmark set. The default settings were then used in Ovid multifile search, EndNote desktop, Mendeley, Zotero, Covidence, and Rayyan to de-duplicate the sample of references independently. Using the benchmark set as reference, the number of false-negative and false-positive duplicate references for each method was identified, and accuracy, sensitivity, and specificity were determined. RESULTS: We found that the most accurate methods for identifying duplicate references were Ovid, Covidence, and Rayyan. Ovid and Covidence possessed the highest specificity for identifying duplicate references, while Rayyan demonstrated the highest sensitivity. CONCLUSION: This study reveals the strengths and weaknesses of commonly used de-duplication methods and provides strategies for improving their performance to avoid unintentionally removing eligible studies and introducing bias into systematic reviews. Along with availability, ease-of-use, functionality, and capability, these findings are important to consider when researchers are selecting database platforms and supporting software programs for conducting systematic reviews. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s13643-021-01583-y.
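The description field states that each method's false-negative and false-positive counts were tallied against the manually curated benchmark set and converted into accuracy, sensitivity, and specificity. A minimal sketch of that scoring step (my own function and data-shape assumptions, not the authors' scripts) could look like:

```python
# Score one de-duplication method against a benchmark set.
# benchmark maps reference ID -> True if the reference is a duplicate
# per manual abstraction; flagged is the set of IDs the method marked.
def score_dedup(benchmark: dict[str, bool], flagged: set[str]) -> dict[str, float]:
    tp = sum(1 for ref, dup in benchmark.items() if dup and ref in flagged)
    fp = sum(1 for ref, dup in benchmark.items() if not dup and ref in flagged)
    fn = sum(1 for ref, dup in benchmark.items() if dup and ref not in flagged)
    tn = sum(1 for ref, dup in benchmark.items() if not dup and ref not in flagged)
    return {
        "accuracy": (tp + tn) / len(benchmark),
        # sensitivity: share of true duplicates the method caught
        "sensitivity": tp / (tp + fn) if (tp + fn) else 0.0,
        # specificity: share of unique records the method correctly kept
        "specificity": tn / (tn + fp) if (tn + fp) else 0.0,
    }
```

In these terms, a false positive (a unique record wrongly flagged) is what can remove an eligible study and bias the review, which is why the study highlights specificity alongside sensitivity.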
format Online Article Text
id pubmed-7827976
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-7827976 2021-01-26 Considerations for conducting systematic reviews: evaluating the performance of different methods for de-duplicating references McKeown, Sandra; Mir, Zuhaib M. Syst Rev Methodology BioMed Central 2021-01-23 /pmc/articles/PMC7827976/ /pubmed/33485394 http://dx.doi.org/10.1186/s13643-021-01583-y Text en © The Author(s) 2021. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution, and reproduction in any medium or format, provided appropriate credit is given to the original author(s) and the source, a link to the licence is provided, and any changes are indicated. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
title Considerations for conducting systematic reviews: evaluating the performance of different methods for de-duplicating references
topic Methodology