Wisdom of the Crowd
Papers from the 2012 AAAI Spring Symposium
Caroline Pantofaru, Sonia Chernova, Alex Sorokin, Program Cochairs
Technical Report SS-12-06
88 pp., $30.00
ISBN 978-1-57735-555-7
Crowd-sourcing provides a convenient and increasingly popular method for gathering large amounts of data and annotations. Amazon's Mechanical Turk and CrowdFlower, games such as the ESP Game, and requests for free annotation help such as LabelMe are just a few examples of crowd-sourcing efforts. These efforts have taught us many lessons and raised still more questions. How can we most effectively elicit the information we need from a distant and potentially anonymous workforce? What kind of workforce is required for different tasks such as user studies and data set labeling? How can we train and evaluate workers?
This symposium brought together researchers from robotics, user interfaces, games, computer vision, and other disciplines to explore the core scientific research challenges of crowd-sourcing and to work toward a set of guidelines for future crowd-sourcing endeavors.