AAAI Publications, Third AAAI Conference on Human Computation and Crowdsourcing

Crowdsourcing from Scratch: A Pragmatic Experiment in Data Collection by Novice Requesters
Alexandra Papoutsaki, Hua Guo, Danae Metaxa-Kakavouli, Connor Gramazio, Jeff Rasley, Wenting Xie, Guan Wang, Jeff Huang

Last modified: 2015-09-23


As crowdsourcing has gained prominence in recent years, an increasing number of people have turned to popular crowdsourcing platforms. Experienced members of the crowdsourcing community have developed numerous systems, tools, and design techniques, both separately and in conjunction with these platforms, to gain specialized functionality and overcome various shortcomings. It is unclear, however, how novice requesters using crowdsourcing platforms for general tasks experience these platforms and how, if at all, their approaches deviate from the best practices established by the crowdsourcing research community. We conducted an experiment with a class of 19 students to study how novice requesters design crowdsourcing tasks. Each student crowdsourced a real data collection task with a fixed budget and a realistic time constraint, using Amazon Mechanical Turk to gather information about the academic careers of over 2,000 professors from 50 top Computer Science departments in the U.S. In addition to curating this dataset, we classify the strategies that emerged, discuss the design choices students made along task dimensions, and compare these novice strategies to best practices identified in the crowdsourcing literature. Finally, we summarize the design pitfalls and effective strategies we observed, providing guidelines for novice requesters.


crowdsourcing; Amazon Mechanical Turk; novice requesters
