Participation and Contribution in Crowdsourced Surveys

Bibliographic Details
Main Authors: Swain, Robert; Berger, Alex; Bongard, Josh; Hines, Paul
Format: Online Article Text
Language: English
Published: Public Library of Science, 2015
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4383627/
https://www.ncbi.nlm.nih.gov/pubmed/25837602
http://dx.doi.org/10.1371/journal.pone.0120521
Collection: PubMed
Description: This paper identifies trends within, and relationships between, the amount of participation and the quality of contributions in three crowdsourced surveys. Participants were asked to perform a collective problem-solving task that lacked any explicit incentive: they were instructed not only to respond to survey questions but also to pose new questions that they thought might, if responded to by others, predict an outcome variable of interest to them. While the three surveys had very different outcome variables, target audiences, methods of advertisement, and lengths of deployment, we found very similar patterns of collective behavior. In particular, we found that the rate at which participants submitted new survey questions followed a heavy-tailed distribution; the distribution of the types of questions posed was similar across surveys; and many users posed non-obvious yet predictive questions. By analyzing responses to questions that contained a built-in range of valid responses, we found that less than 0.2% of responses lay outside those ranges, indicating that most participants tend to respond honestly to surveys of this form, even without explicit incentives for honesty. While we did not find a significant relationship between the quantity of participation and the quality of contribution for either response submissions or question submissions, we did find several other, more nuanced participant behavior patterns, which did correlate with contribution in one of the three surveys. We conclude that there exists an optimal time for users to pose questions: early in their participation, but only after they have submitted a few responses to other questions. This suggests that future crowdsourced surveys may attract more predictive questions by prompting users to pose new questions at specific times during their participation and limiting question submission at non-optimal times.
Record ID: pubmed-4383627
Institution: National Center for Biotechnology Information
Record Format: MEDLINE/PubMed
Journal: PLoS One
Published Online: 2015-04-02
License: © 2015 Swain et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited.
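The abstract's honesty check (fewer than 0.2% of responses fell outside the built-in valid ranges) is simple to reproduce on similar data. The sketch below is illustrative only: the data layout, field names, and sample values are assumptions for the example, not the authors' actual pipeline.

```python
# Illustrative sketch of the out-of-range check described in the abstract.
# The layout (question_id -> (lo, hi) valid range, plus a response list)
# is hypothetical; the paper does not specify its storage format.

# Hypothetical built-in valid ranges for a few survey questions.
valid_ranges = {
    "age": (18, 120),
    "hours_of_sleep": (0, 24),
    "weekly_exercise_sessions": (0, 50),
}

# Hypothetical (question_id, value) response pairs; 250 is out of range.
responses = [
    ("age", 34), ("age", 29), ("age", 250),
    ("hours_of_sleep", 7), ("hours_of_sleep", 8),
    ("weekly_exercise_sessions", 3),
]

def out_of_range_fraction(responses, valid_ranges):
    """Fraction of responses falling outside their question's valid range."""
    checked = [(q, v) for q, v in responses if q in valid_ranges]
    if not checked:
        return 0.0
    bad = sum(1 for q, v in checked
              if not (valid_ranges[q][0] <= v <= valid_ranges[q][1]))
    return bad / len(checked)

if __name__ == "__main__":
    frac = out_of_range_fraction(responses, valid_ranges)
    # On this invented sample the fraction is large; the paper reports
    # the real figure below 0.2% across all three surveys.
    print(f"{frac:.1%} of range-checked responses are out of range")
```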
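Similarly, the claim that question-submission rates follow a heavy-tailed distribution can be inspected from per-participant counts. This is a minimal sketch of that inspection, assuming the data can be reduced to one user label per submitted question; the submission log here is invented for illustration.

```python
# Illustrative check for a heavy tail in questions posed per user.
# A heavy tail shows up as a roughly straight line when the complementary
# CDF is plotted on log-log axes; here we just print the CCDF values.
from collections import Counter

# Hypothetical submission log: which user posed each question.
submissions = ["u1"] * 40 + ["u2"] * 12 + ["u3"] * 5 + \
              ["u4"] * 2 + ["u5", "u6", "u7", "u8"]

counts = sorted(Counter(submissions).values(), reverse=True)
n = len(counts)

# Empirical complementary CDF: P(questions >= x) for each distinct count x.
for x in sorted(set(counts)):
    ccdf = sum(1 for c in counts if c >= x) / n
    print(f"P(questions >= {x:>2}) = {ccdf:.2f}")
```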