Lessons Learned from Crowdsourcing Complex Engineering Tasks
CROWDSOURCING: Crowdsourcing is the practice of obtaining needed ideas, services, or content by requesting contributions from a large group of people. Amazon Mechanical Turk is a web marketplace for crowdsourcing microtasks, such as answering surveys and image tagging. We explored the limits of crowdsourcing by using Mechanical Turk for a more complicated task: analysis and creation of wind simulations.
Main Authors: Staffelbach, Matthew; Sempolinski, Peter; Kijewski-Correa, Tracy; Thain, Douglas; Wei, Daniel; Kareem, Ahsan; Madey, Gregory
Format: Online Article Text
Language: English
Published: Public Library of Science, 2015
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4575153/ https://www.ncbi.nlm.nih.gov/pubmed/26383029 http://dx.doi.org/10.1371/journal.pone.0134978
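The abstract below describes posting analysis questions to anonymous workers on Amazon Mechanical Turk. As a minimal, purely illustrative sketch (not the authors' actual pipeline), the snippet shows how one such question could be published as a HIT through the boto3 MTurk client; the question wording, reward amount, worker count, and sandbox endpoint are all assumed values for demonstration.

```python
# Illustrative sketch only: publishing one analysis question as a
# Mechanical Turk HIT with boto3. Question text, reward, worker count,
# and the sandbox endpoint are assumed values, not the study's settings.
import boto3

mturk = boto3.client(
    "mturk",
    region_name="us-east-1",
    # Sandbox endpoint, so test HITs incur no real cost.
    endpoint_url="https://mturk-requester-sandbox.us-east-1.amazonaws.com",
)

# A single required free-text question in MTurk's QuestionForm XML schema.
question_xml = """<QuestionForm xmlns="http://mechanicalturk.amazonaws.com/AWSMechanicalTurkDataSchemas/2005-10-01/QuestionForm.xsd">
  <Question>
    <QuestionIdentifier>wind_q1</QuestionIdentifier>
    <IsRequired>true</IsRequired>
    <QuestionContent>
      <Text>Based on the plotted simulation output, which building face shows the highest mean wind pressure?</Text>
    </QuestionContent>
    <AnswerSpecification>
      <FreeTextAnswer/>
    </AnswerSpecification>
  </Question>
</QuestionForm>"""

hit = mturk.create_hit(
    Title="Analyze wind simulation output (tutorial provided)",
    Description="Answer one question about a computational wind simulation result.",
    Keywords="engineering, analysis, survey",
    Reward="0.50",                     # per-assignment payment in USD (assumed)
    MaxAssignments=30,                 # number of distinct workers (assumed)
    LifetimeInSeconds=7 * 24 * 3600,   # HIT stays listed for one week
    AssignmentDurationInSeconds=3600,  # each worker gets one hour
    Question=question_xml,
)
print("Created HIT:", hit["HIT"]["HITId"])
```

In such a setup, worker responses would later be collected (for example with the client's list_assignments_for_hit call) and compared against the graduate-student answers, roughly mirroring the first phase described in the abstract.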
_version_ | 1782390739073761280 |
author | Staffelbach, Matthew Sempolinski, Peter Kijewski-Correa, Tracy Thain, Douglas Wei, Daniel Kareem, Ahsan Madey, Gregory |
author_facet | Staffelbach, Matthew Sempolinski, Peter Kijewski-Correa, Tracy Thain, Douglas Wei, Daniel Kareem, Ahsan Madey, Gregory |
author_sort | Staffelbach, Matthew |
collection | PubMed |
description | CROWDSOURCING: Crowdsourcing is the practice of obtaining needed ideas, services, or content by requesting contributions from a large group of people. Amazon Mechanical Turk is a web marketplace for crowdsourcing microtasks, such as answering surveys and image tagging. We explored the limits of crowdsourcing by using Mechanical Turk for a more complicated task: analysis and creation of wind simulations. HARNESSING CROWDWORKERS FOR ENGINEERING: Our investigation examined the feasibility of using crowdsourcing for complex, highly technical tasks. This was done to determine whether the benefits of crowdsourcing could be harnessed to accurately and effectively contribute to solving complex, real-world engineering problems. Of course, untrained crowds cannot be used as a mere substitute for trained expertise. Rather, we sought to understand how crowd workers can be used as a large pool of labor for a preliminary analysis of complex data. VIRTUAL WIND TUNNEL: We compared the skill of anonymous crowd workers from Amazon Mechanical Turk with that of civil engineering graduate students in making a first pass at analyzing wind simulation data. For the first phase, we posted analysis questions to Amazon crowd workers and to two groups of civil engineering graduate students. A second phase of our experiment instructed crowd workers and students to create simulations on our Virtual Wind Tunnel website to solve a more complex task. CONCLUSIONS: With a sufficiently comprehensive tutorial and compensation similar to typical crowdsourcing wages, we were able to enlist crowd workers to effectively complete longer, more complex tasks with competence comparable to that of graduate students with more comprehensive, expert-level knowledge. Furthermore, more complex tasks require increased communication with the workers. As tasks become more complex, the employment relationship becomes more akin to outsourcing than to crowdsourcing. Through this investigation, we were able to stretch and explore the limits of crowdsourcing as a tool for solving complex problems. |
format | Online Article Text |
id | pubmed-4575153 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2015 |
publisher | Public Library of Science |
record_format | MEDLINE/PubMed |
spelling | pubmed-45751532015-09-25 Lessons Learned from Crowdsourcing Complex Engineering Tasks Staffelbach, Matthew Sempolinski, Peter Kijewski-Correa, Tracy Thain, Douglas Wei, Daniel Kareem, Ahsan Madey, Gregory PLoS One Research Article CROWDSOURCING: Crowdsourcing is the practice of obtaining needed ideas, services, or content by requesting contributions from a large group of people. Amazon Mechanical Turk is a web marketplace for crowdsourcing microtasks, such as answering surveys and image tagging. We explored the limits of crowdsourcing by using Mechanical Turk for a more complicated task: analysis and creation of wind simulations. HARNESSING CROWDWORKERS FOR ENGINEERING: Our investigation examined the feasibility of using crowdsourcing for complex, highly technical tasks. This was done to determine whether the benefits of crowdsourcing could be harnessed to accurately and effectively contribute to solving complex, real-world engineering problems. Of course, untrained crowds cannot be used as a mere substitute for trained expertise. Rather, we sought to understand how crowd workers can be used as a large pool of labor for a preliminary analysis of complex data. VIRTUAL WIND TUNNEL: We compared the skill of anonymous crowd workers from Amazon Mechanical Turk with that of civil engineering graduate students in making a first pass at analyzing wind simulation data. For the first phase, we posted analysis questions to Amazon crowd workers and to two groups of civil engineering graduate students. A second phase of our experiment instructed crowd workers and students to create simulations on our Virtual Wind Tunnel website to solve a more complex task. CONCLUSIONS: With a sufficiently comprehensive tutorial and compensation similar to typical crowdsourcing wages, we were able to enlist crowd workers to effectively complete longer, more complex tasks with competence comparable to that of graduate students with more comprehensive, expert-level knowledge. Furthermore, more complex tasks require increased communication with the workers. As tasks become more complex, the employment relationship becomes more akin to outsourcing than to crowdsourcing. Through this investigation, we were able to stretch and explore the limits of crowdsourcing as a tool for solving complex problems. Public Library of Science 2015-09-18 /pmc/articles/PMC4575153/ /pubmed/26383029 http://dx.doi.org/10.1371/journal.pone.0134978 Text en © 2015 Staffelbach et al http://creativecommons.org/licenses/by/4.0/ This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are properly credited. |
spellingShingle | Research Article Staffelbach, Matthew Sempolinski, Peter Kijewski-Correa, Tracy Thain, Douglas Wei, Daniel Kareem, Ahsan Madey, Gregory Lessons Learned from Crowdsourcing Complex Engineering Tasks |
title | Lessons Learned from Crowdsourcing Complex Engineering Tasks |
title_full | Lessons Learned from Crowdsourcing Complex Engineering Tasks |
title_fullStr | Lessons Learned from Crowdsourcing Complex Engineering Tasks |
title_full_unstemmed | Lessons Learned from Crowdsourcing Complex Engineering Tasks |
title_short | Lessons Learned from Crowdsourcing Complex Engineering Tasks |
title_sort | lessons learned from crowdsourcing complex engineering tasks |
topic | Research Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4575153/ https://www.ncbi.nlm.nih.gov/pubmed/26383029 http://dx.doi.org/10.1371/journal.pone.0134978 |
work_keys_str_mv | AT staffelbachmatthew lessonslearnedfromcrowdsourcingcomplexengineeringtasks AT sempolinskipeter lessonslearnedfromcrowdsourcingcomplexengineeringtasks AT kijewskicorreatracy lessonslearnedfromcrowdsourcingcomplexengineeringtasks AT thaindouglas lessonslearnedfromcrowdsourcingcomplexengineeringtasks AT weidaniel lessonslearnedfromcrowdsourcingcomplexengineeringtasks AT kareemahsan lessonslearnedfromcrowdsourcingcomplexengineeringtasks AT madeygregory lessonslearnedfromcrowdsourcingcomplexengineeringtasks |