AAAI Publications, Third AAAI Conference on Human Computation and Crowdsourcing

Using Anonymity and Communal Efforts to Improve Quality of Crowdsourced Feedback
Julie Hui, Amos Glenn, Rachel Jue, Elizabeth Gerber, Steven Dow

Last modified: 2015-09-23

Abstract


Student entrepreneurs struggle to collect feedback on their product pitches in a classroom setting due to a lack of time, money, and access to motivated feedback providers. Online social networks present a unique opportunity for entrepreneurial students to quickly reach feedback providers by leveraging their online social capital. To better understand how to improve crowdsourced online pitch feedback, we performed an experiment testing the effect of online anonymity on pitch feedback quality and quantity. We also tested a communal feedback method—evenly distributing feedback providers from the class's collective online social networks among teams—so that all teams could benefit from a useful amount of feedback, rather than some teams receiving far more feedback than others. We found that feedback providers in the anonymous condition provided significantly more specific criticism and specific praise, which students rated as more useful. Furthermore, we found that the communal feedback method helped all teams receive sufficient feedback to revise their pitches. This research contributes to the crowdsourcing community an empirical investigation of how crowds reached through online social networks can help student entrepreneurs obtain authentic feedback to improve their work.


Keywords


crowdsourcing; social media; social networks; feedback; classroom
