
The Impact of Systematic Review Automation Tools on Methodological Quality and Time Taken to Complete Systematic Review Tasks: Case Study


Bibliographic Details

Main Authors: Clark, Justin, McFarlane, Catherine, Cleo, Gina, Ishikawa Ramos, Christiane, Marshall, Skye
Format: Online Article Text
Language: English
Published: JMIR Publications 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8204237/
https://www.ncbi.nlm.nih.gov/pubmed/34057072
http://dx.doi.org/10.2196/24418
author Clark, Justin
McFarlane, Catherine
Cleo, Gina
Ishikawa Ramos, Christiane
Marshall, Skye
collection PubMed
description BACKGROUND: Systematic reviews (SRs) are considered the highest level of evidence to answer research questions; however, they are time and resource intensive.
OBJECTIVE: When comparing SR tasks done manually, using standard methods, versus those same SR tasks done using automated tools, (1) what is the difference in time to complete the SR task and (2) what is the impact on the error rate of the SR task?
METHODS: A case study compared specific tasks done during the conduct of an SR on prebiotic, probiotic, and synbiotic supplementation in chronic kidney disease. Two participants (the manual team) conducted the SR using current methods, comprising a total of 16 tasks. Another two participants (the automation team) conducted the tasks for which a systematic review automation (SRA) tool was available, comprising a total of six tasks. The time taken and error rate of the six tasks completed by both teams were compared.
RESULTS: The manual team took approximately 126 hours to produce a draft of the background, methods, and results sections of the SR. For the six tasks in which times were compared, the manual team spent 2493 minutes (42 hours), compared to 708 minutes (12 hours) spent by the automation team. The manual team had a higher error rate in two of the six tasks: in Task 5 (Run the systematic search), the manual team made eight errors versus three made by the automation team; in Task 12 (Assess the risk of bias), 25 assessments by the manual team differed from a reference standard, compared to 20 differences for the automation team. The manual team had a lower error rate in one of the six tasks: in Task 6 (Deduplicate search results), the manual team removed one unique study and missed zero duplicates, versus the automation team, which removed two unique studies and missed seven duplicates. Error rates were similar for the two remaining compared tasks: in Task 7 (Screen the titles and abstracts) and Task 9 (Screen the full text), zero relevant studies were excluded by either team. One task, Task 8 (Find the full text), could not be compared between groups.
CONCLUSIONS: For the majority of SR tasks where an SRA tool was used, the time required to complete the task was reduced for novice researchers while methodological quality was maintained.
format Online
Article
Text
id pubmed-8204237
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher JMIR Publications
record_format MEDLINE/PubMed
spelling pubmed-8204237 2021-06-29 JMIR Med Educ Original Paper JMIR Publications 2021-05-31 /pmc/articles/PMC8204237/ /pubmed/34057072 http://dx.doi.org/10.2196/24418 Text en ©Justin Clark, Catherine McFarlane, Gina Cleo, Christiane Ishikawa Ramos, Skye Marshall. Originally published in JMIR Medical Education (https://mededu.jmir.org), 31.05.2021. This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in JMIR Medical Education, is properly cited. The complete bibliographic information, a link to the original publication on https://mededu.jmir.org/, as well as this copyright and license information must be included.
title The Impact of Systematic Review Automation Tools on Methodological Quality and Time Taken to Complete Systematic Review Tasks: Case Study
topic Original Paper
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8204237/
https://www.ncbi.nlm.nih.gov/pubmed/34057072
http://dx.doi.org/10.2196/24418
work_keys_str_mv AT clarkjustin theimpactofsystematicreviewautomationtoolsonmethodologicalqualityandtimetakentocompletesystematicreviewtaskscasestudy
AT mcfarlanecatherine theimpactofsystematicreviewautomationtoolsonmethodologicalqualityandtimetakentocompletesystematicreviewtaskscasestudy
AT cleogina theimpactofsystematicreviewautomationtoolsonmethodologicalqualityandtimetakentocompletesystematicreviewtaskscasestudy
AT ishikawaramoschristiane theimpactofsystematicreviewautomationtoolsonmethodologicalqualityandtimetakentocompletesystematicreviewtaskscasestudy
AT marshallskye theimpactofsystematicreviewautomationtoolsonmethodologicalqualityandtimetakentocompletesystematicreviewtaskscasestudy
AT clarkjustin impactofsystematicreviewautomationtoolsonmethodologicalqualityandtimetakentocompletesystematicreviewtaskscasestudy
AT mcfarlanecatherine impactofsystematicreviewautomationtoolsonmethodologicalqualityandtimetakentocompletesystematicreviewtaskscasestudy
AT cleogina impactofsystematicreviewautomationtoolsonmethodologicalqualityandtimetakentocompletesystematicreviewtaskscasestudy
AT ishikawaramoschristiane impactofsystematicreviewautomationtoolsonmethodologicalqualityandtimetakentocompletesystematicreviewtaskscasestudy
AT marshallskye impactofsystematicreviewautomationtoolsonmethodologicalqualityandtimetakentocompletesystematicreviewtaskscasestudy