
Crowdsourcing citation-screening in a mixed-studies systematic review: a feasibility study

Bibliographic Details
Main Authors: Noel-Storr, Anna H., Redmond, Patrick, Lamé, Guillaume, Liberati, Elisa, Kelly, Sarah, Miller, Lucy, Dooley, Gordon, Paterson, Andy, Burt, Jenni
Format: Online Article Text
Language: English
Published: BioMed Central 2021
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8077753/
https://www.ncbi.nlm.nih.gov/pubmed/33906604
http://dx.doi.org/10.1186/s12874-021-01271-4
author Noel-Storr, Anna H.
Redmond, Patrick
Lamé, Guillaume
Liberati, Elisa
Kelly, Sarah
Miller, Lucy
Dooley, Gordon
Paterson, Andy
Burt, Jenni
collection PubMed
description BACKGROUND: Crowdsourcing engages the help of large numbers of people in tasks, activities or projects, usually via the internet. One application of crowdsourcing is the screening of citations for inclusion in a systematic review. There is evidence that a ‘Crowd’ of non-specialists can reliably identify quantitative studies, such as randomized controlled trials, through the assessment of study titles and abstracts. In this feasibility study, we investigated crowd performance of an online, topic-based citation-screening task, assessing titles and abstracts for inclusion in a single mixed-studies systematic review.
METHODS: This study was embedded within a mixed-studies systematic review of maternity care, exploring the effects of training healthcare professionals in intrapartum cardiotocography. Citation-screening was undertaken via Cochrane Crowd, an online citizen science platform enabling volunteers to contribute to a range of tasks identifying evidence in health and healthcare. Contributors were recruited from users registered with Cochrane Crowd. Following completion of task-specific online training, the crowd and the review team independently screened 9546 titles and abstracts. The screening task was subsequently repeated with a new crowd following minor changes to the crowd agreement algorithm, based on findings from the first screening task. We assessed the crowd decisions against the review team categorizations (the ‘gold standard’), measuring sensitivity, specificity, time and task engagement.
RESULTS: Seventy-eight crowd contributors completed the first screening task. Sensitivity (the crowd’s ability to correctly identify studies included within the review) was 84% (N = 42/50), and specificity (the crowd’s ability to correctly identify excluded studies) was 99% (N = 9373/9493). Task completion took 33 h for the crowd and 410 h for the review team; mean time to classify each record was 6.06 s for each crowd participant and 3.96 s for review team members. Replicating this task with 85 new contributors and an altered agreement algorithm found 94% sensitivity (N = 48/50) and 98% specificity (N = 9348/9493). Contributors reported positive experiences of the task.
CONCLUSION: It might be feasible to recruit and train a crowd to accurately perform topic-based citation-screening for mixed-studies systematic reviews, though the resources expended on the necessary customised training should be factored in. In the face of long review production times, crowd screening may enable more time-efficient conduct of reviews, with minimal reduction in citation-screening accuracy, but further research is needed.
SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s12874-021-01271-4.
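As a quick arithmetic check on the accuracy figures reported above, the following minimal sketch in Python recomputes sensitivity and specificity from the raw counts given in the Results section; the function and variable names are illustrative and are not taken from the study.

# Minimal sketch: recomputing the screening accuracy figures from the
# counts reported in the abstract (names are illustrative only).

def sensitivity(crowd_true_positives: int, review_team_includes: int) -> float:
    """Proportion of review-team 'include' decisions the crowd also identified."""
    return crowd_true_positives / review_team_includes

def specificity(crowd_true_negatives: int, review_team_excludes: int) -> float:
    """Proportion of review-team 'exclude' decisions the crowd also identified."""
    return crowd_true_negatives / review_team_excludes

# First screening task (78 contributors), counts from the Results section.
print(f"Sensitivity: {sensitivity(42, 50):.0%}")      # 84%
print(f"Specificity: {specificity(9373, 9493):.1%}")  # 98.7%, reported as 99%

# Replication task (85 contributors, altered agreement algorithm).
print(f"Replication specificity: {specificity(9348, 9493):.1%}")  # 98.5%, reported as 98%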
format Online
Article
Text
id pubmed-8077753
institution National Center for Biotechnology Information
language English
publishDate 2021
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-8077753 2021-04-29 Crowdsourcing citation-screening in a mixed-studies systematic review: a feasibility study. BMC Med Res Methodol, Research Article.
BioMed Central 2021-04-26 /pmc/articles/PMC8077753/ /pubmed/33906604 http://dx.doi.org/10.1186/s12874-021-01271-4 Text en © The Author(s) 2021. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
title Crowdsourcing citation-screening in a mixed-studies systematic review: a feasibility study
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8077753/
https://www.ncbi.nlm.nih.gov/pubmed/33906604
http://dx.doi.org/10.1186/s12874-021-01271-4