Participation and Contribution in Crowdsourced Surveys
| Field | Value |
|---|---|
| Main Authors | |
| Format | Online Article Text |
| Language | English |
| Published | Public Library of Science, 2015 |
| Subjects | |
| Online Access | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4383627/ https://www.ncbi.nlm.nih.gov/pubmed/25837602 http://dx.doi.org/10.1371/journal.pone.0120521 |

Summary: This paper identifies trends within and relationships between the amount of participation and the quality of contributions in three crowdsourced surveys. Participants were asked to perform a collective problem-solving task that lacked any explicit incentive: they were instructed not only to respond to survey questions but also to pose new questions that they thought might, if responded to by others, predict an outcome variable of interest to them. While the three surveys had very different outcome variables, target audiences, methods of advertisement, and lengths of deployment, we found very similar patterns of collective behavior. In particular, we found that the rate at which participants submitted new survey questions followed a heavy-tailed distribution; the distribution of the types of questions posed was similar across surveys; and many users posed non-obvious yet predictive questions. By analyzing responses to questions that contained a built-in range of valid responses, we found that less than 0.2% of responses lay outside of those ranges, indicating that most participants tend to respond honestly to surveys of this form, even without explicit incentives for honesty. While we did not find a significant relationship between the quantity of participation and the quality of contribution for either response submissions or question submissions, we did find several more nuanced participant behavior patterns that correlated with contribution in one of the three surveys. We conclude that there exists an optimal time for users to pose questions: early in their participation, but only after they have submitted a few responses to other questions. This suggests that future crowdsourced surveys may attract more predictive questions by prompting users to pose new questions at specific times during their participation and limiting question submission at non-optimal times.
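The range-validity check described in the summary lends itself to a short illustration. The Python sketch below computes the fraction of responses that fall outside each question's built-in valid range; the data layout, question IDs, and bounds are hypothetical stand-ins, since the paper's actual data schema is not reproduced here.

```python
# Hedged sketch of the out-of-range response check described in the summary.
# The record layout (question IDs, valid ranges, responses) is hypothetical.

responses = [
    # (question_id, numeric response)
    ("q1", 34), ("q1", 29), ("q2", 7), ("q2", 12), ("q1", 150),
]
valid_ranges = {
    # question_id -> (inclusive lower bound, inclusive upper bound)
    "q1": (0, 120),  # e.g. an age question
    "q2": (0, 10),   # e.g. a 0-10 rating question
}

# Count responses falling outside their question's declared range.
out_of_range = sum(
    1 for qid, value in responses
    if not (valid_ranges[qid][0] <= value <= valid_ranges[qid][1])
)
fraction = out_of_range / len(responses)
print(f"{fraction:.1%} of responses fall outside the valid range")
```

On this toy data the script reports 20.0%; the paper's finding is that, on the real survey data, the corresponding figure was below 0.2%.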