First AAAI Conference on Human Computation and Crowdsourcing (HCOMP 2013)

Crowdsourcing Objective Answers to Subjective Questions Online
Ravi Iyer


Abstract

In this demonstration, we show how Ranker’s algorithms use diverse sampling, measurement, and algorithmic techniques to crowdsource answers to subjective questions in a real-world online environment where user behavior is difficult to control. As of September 2013, Ranker receives approximately 8 million visitors per month and collects over 1.5 million user opinions monthly. Such an environment requires tradeoffs among computational complexity, projected user engagement, and accuracy, and aggregating across diverse techniques allows us to mitigate the sizable errors specific to any individual, imperfect crowdsourcing method. We will specifically show how relatively unstructured crowdsourcing can yield surprisingly accurate predictions of movie box-office revenue, celebrity mortality, and retail pizza topping sales.
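The abstract names aggregation across imperfect methods as the key error-mitigation step but does not include code. The following is a minimal illustrative sketch, not Ranker’s actual method: assuming each crowdsourcing method yields an independent, noisy score per item on its own scale, z-scoring each method’s output and averaging lets the method-specific errors partially cancel. All method names and numbers below are hypothetical.

```python
# Minimal sketch (not Ranker's actual code): combine item scores from
# several imperfect crowdsourcing methods so that the errors specific
# to each method partially cancel in the aggregate.
from statistics import mean, stdev

def zscore(scores):
    """Standardize one method's scores so methods that report on
    different scales can be averaged fairly."""
    mu, sigma = mean(scores.values()), stdev(scores.values())
    return {item: (s - mu) / sigma for item, s in scores.items()}

def aggregate(method_scores):
    """Average each item's z-scored value across all methods that
    scored it, then return items sorted best-first."""
    combined = {}
    for scores in method_scores:
        for item, z in zscore(scores).items():
            combined.setdefault(item, []).append(z)
    return sorted(((mean(zs), item) for item, zs in combined.items()),
                  reverse=True)

# Hypothetical inputs: three methods scoring the same items on
# different scales (raw votes, average list rank, pairwise win rate).
votes    = {"A": 120, "B": 90,   "C": 40}
reranks  = {"A": 2.1, "B": 2.6,  "C": 1.2}
pairwise = {"A": 0.7, "B": 0.65, "C": 0.3}

for score, item in aggregate([votes, reranks, pairwise]):
    print(f"{item}: {score:+.2f}")
```

Averaging standardized scores is only one of many aggregation schemes (Borda counts or Bradley-Terry fits are common alternatives); it is used here purely to make the error-cancellation intuition concrete.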
