
Mapping of Crowdsourcing in Health: Systematic Review


Bibliographic Details
Main Authors: Créquit, Perrine, Mansouri, Ghizlène, Benchoufi, Mehdi, Vivot, Alexandre, Ravaud, Philippe
Format: Online Article Text
Language: English
Published: JMIR Publications 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5974463/
https://www.ncbi.nlm.nih.gov/pubmed/29764795
http://dx.doi.org/10.2196/jmir.9330
author Créquit, Perrine
Mansouri, Ghizlène
Benchoufi, Mehdi
Vivot, Alexandre
Ravaud, Philippe
collection PubMed
description BACKGROUND: Crowdsourcing involves obtaining ideas, needed services, or content by soliciting Web-based contributions from a crowd. The 4 types of crowdsourced tasks (problem solving, data processing, surveillance or monitoring, and surveying) can be applied in the 3 categories of health (promotion, research, and care).

OBJECTIVE: This study aimed to map the different applications of crowdsourcing in health to assess the fields of health that are using crowdsourcing and the crowdsourced tasks used. We also describe the logistics of crowdsourcing and the characteristics of crowd workers.

METHODS: MEDLINE, EMBASE, and ClinicalTrials.gov were searched for available reports from inception to March 30, 2016, with no restriction on language or publication status.

RESULTS: We identified 202 relevant studies that used crowdsourcing, including 9 randomized controlled trials, of which only one had posted results at ClinicalTrials.gov. Crowdsourcing was used in health promotion (91/202, 45.0%), research (73/202, 36.1%), and care (38/202, 18.8%). The 4 most frequent areas of application were public health (67/202, 33.2%), psychiatry (32/202, 15.8%), surgery (22/202, 10.9%), and oncology (14/202, 6.9%). Half of the reports (99/202, 49.0%) referred to data processing, 34.6% (70/202) referred to surveying, 10.4% (21/202) referred to surveillance or monitoring, and 5.9% (12/202) referred to problem-solving. Labor market platforms (eg, Amazon Mechanical Turk) were used in most studies (190/202, 94%). The crowd workers’ characteristics were poorly reported, and crowdsourcing logistics were missing from two-thirds of the reports. When reported, the median size of the crowd was 424 (first and third quartiles: 167-802); crowd workers’ median age was 34 years (32-36). Crowd workers were mainly recruited nationally, particularly in the United States. For many studies (58.9%, 119/202), previous experience in crowdsourcing was required, and passing a qualification test or training was seldom needed (11.9% of studies; 24/202). For half of the studies, monetary incentives were mentioned, with mainly less than US $1 to perform the task. The time needed to perform the task was mostly less than 10 min (58.9% of studies; 119/202). Data quality validation was used in 54/202 studies (26.7%), mainly by attention check questions or by replicating the task with several crowd workers.

CONCLUSIONS: The use of crowdsourcing, which allows access to a large pool of participants as well as saving time in data collection, lowering costs, and speeding up innovations, is increasing in health promotion, research, and care. However, the description of crowdsourcing logistics and crowd workers’ characteristics is frequently missing in study reports and needs to be precisely reported to better interpret the study findings and replicate them.
format Online
Article
Text
id pubmed-5974463
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher JMIR Publications
record_format MEDLINE/PubMed
spelling pubmed-5974463 2018-06-01 Mapping of Crowdsourcing in Health: Systematic Review Créquit, Perrine Mansouri, Ghizlène Benchoufi, Mehdi Vivot, Alexandre Ravaud, Philippe J Med Internet Res Review JMIR Publications 2018-05-15 /pmc/articles/PMC5974463/ /pubmed/29764795 http://dx.doi.org/10.2196/jmir.9330 Text en ©Perrine Créquit, Ghizlène Mansouri, Mehdi Benchoufi, Alexandre Vivot, Philippe Ravaud. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 15.05.2018.
This is an open-access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work, first published in the Journal of Medical Internet Research, is properly cited. The complete bibliographic information, a link to the original publication on http://www.jmir.org/, as well as this copyright and license information must be included.
title Mapping of Crowdsourcing in Health: Systematic Review
topic Review
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5974463/
https://www.ncbi.nlm.nih.gov/pubmed/29764795
http://dx.doi.org/10.2196/jmir.9330
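The abstract above notes that data quality in the reviewed crowdsourcing studies was most often validated with attention-check questions or by having several crowd workers replicate the same task. As a purely illustrative sketch, not taken from the article, the following Python snippet shows one common way such replicated labels can be aggregated by majority vote; the labels and worker responses are hypothetical.

from collections import Counter

def majority_vote(labels):
    # Aggregate replicated crowd labels for a single item by majority vote.
    # Returns the winning label and the share of workers who chose it,
    # which can serve as a crude agreement score.
    counts = Counter(labels)
    label, votes = counts.most_common(1)[0]
    return label, votes / len(labels)

# Hypothetical example: three crowd workers classify the same record.
worker_labels = ["relevant", "relevant", "not relevant"]
label, agreement = majority_vote(worker_labels)
print(label, round(agreement, 2))  # relevant 0.67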