Naturalistic Distributed Experimentation as a Source of New Insight
Sandy J. J. Gould, Anna L. Cox, Duncan P. Brumby

Abstract


Human performance experiments are often conducted online with the help of paid crowdworkers and citizen scientists. This approach produces reliable data, but there are concerns that the loss of control that inevitably accompanies online experimentation might confound results. Researchers have therefore focused on ways of regaining control and of mitigating the effects of confounds. In this abstract we argue that confounding factors in online work can instead be put to novel use, giving us insight into research questions we might otherwise be unable to answer.

Keywords


Online experimentation; crowdsourcing; citizen science; confounding variables
