{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,12,5]],"date-time":"2025-12-05T12:10:18Z","timestamp":1764936618233,"version":"3.41.0"},"reference-count":30,"publisher":"Association for Computing Machinery (ACM)","issue":"2","license":[{"start":{"date-parts":[[2012,1,9]],"date-time":"2012-01-09T00:00:00Z","timestamp":1326067200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/www.acm.org\/publications\/policies\/copyright_policy#Background"}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":["SIGIR Forum"],"published-print":{"date-parts":[[2012,1,9]]},"abstract":"<jats:p>The 2nd SIGIR Workshop on Crowdsourcing for Information Retrieval (CIR 2011) was held on July 28, 2011 in Beijing, China, in conjunction with the 34th Annual ACM SIGIR Conference1. The workshop brought together researchers and practitioners to disseminate recent advances in theory, empirical methods, and novel applications of crowdsourcing for information retrieval (IR). The workshop program included three invited talks, a panel discussion entitled Beyond the Lab: State-of-the-Art and Open Challenges in Practical Crowdsourcing, and presentation of nine refereed research papers and one demonstration paper. A Best Paper Award, sponored by Microsoft Bing, was awarded to Jun Wang and Bei Yu for their paper entitled Labeling Images with Queries: A Recall-based Image Retrieval Game Approach. A Crowdsourcing Challenge contest was also announced prior to the workshop, sponsored by CrowdFlower. The contest offered both seed funding and advanced technical support for the winner to use CrowdFlower's services for innovative work. Workshop organizers selected Mark Smucker as the winner based on his proposal entitled: The Crowd vs. the Lab: A Comparison of Crowd-Sourced and University Laboratory Participant Behavior. Proceedings of the workshop are available online2 [15].<\/jats:p>","DOI":"10.1145\/2093346.2093356","type":"journal-article","created":{"date-parts":[[2012,1,10]],"date-time":"2012-01-10T17:01:25Z","timestamp":1326214885000},"page":"66-75","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":38,"title":["Crowdsourcing for information retrieval"],"prefix":"10.1145","volume":"45","author":[{"given":"Matthew","family":"Lease","sequence":"first","affiliation":[{"name":"University of Texas, Austin, TX"}]},{"given":"Emine","family":"Yilmaz","sequence":"additional","affiliation":[{"name":"Microsoft Research Cambridge, UK and Koc University Istanbul, Turkey"}]}],"member":"320","published-online":{"date-parts":[[2012,1,9]]},"reference":[{"key":"e_1_2_1_1_1","volume-title":"Tutorial: Crowdsourcing for Relevance Evaluation. In Proceedings of the 32nd European Conference on IR Research (ECIR), 2010","author":"Alonso O.","year":"2010","unstructured":"O. Alonso . Tutorial: Crowdsourcing for Relevance Evaluation. In Proceedings of the 32nd European Conference on IR Research (ECIR), 2010 . Slides available online at http:\/\/ir.ischool.utexas.edu\/cse 2010 \/materials\/alonso-ecir2010-tutorial.pdf. O. Alonso. Tutorial: Crowdsourcing for Relevance Evaluation. In Proceedings of the 32nd European Conference on IR Research (ECIR), 2010. 
Slides available online at http:\/\/ir.ischool.utexas.edu\/cse2010\/materials\/alonso-ecir2010-tutorial.pdf."},{"key":"e_1_2_1_2_1","doi-asserted-by":"publisher","DOI":"10.1145\/1935826.1935831"},{"key":"e_1_2_1_3_1","doi-asserted-by":"publisher","DOI":"10.1145\/2009916.2010170"},{"key":"e_1_2_1_4_1","first-page":"15","volume-title":"Proceedings of the SIGIR 2009 Workshop on the Future of IR Evaluation","author":"Alonso O.","year":"2009","unstructured":"O. Alonso and S. Mizzaro . Can we get rid of TREC assessors? Using Mechanical Turk for relevance assessment . In Proceedings of the SIGIR 2009 Workshop on the Future of IR Evaluation , pages 15 -- 16 , 2009 . O. Alonso and S. Mizzaro. Can we get rid of TREC assessors? Using Mechanical Turk for relevance assessment. In Proceedings of the SIGIR 2009 Workshop on the Future of IR Evaluation, pages 15--16, 2009."},{"key":"e_1_2_1_5_1","doi-asserted-by":"publisher","DOI":"10.1145\/1148170.1148263"},{"key":"e_1_2_1_6_1","doi-asserted-by":"publisher","DOI":"10.1145\/1148170.1148219"},{"key":"e_1_2_1_7_1","doi-asserted-by":"publisher","DOI":"10.1145\/1924475.1924481"},{"key":"e_1_2_1_8_1","doi-asserted-by":"publisher","DOI":"10.5555\/275537.275544"},{"key":"e_1_2_1_9_1","volume-title":"Vries. GEAnn - Games for Engaging Annotations. In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval","author":"Eickhoff C.","year":"2011","unstructured":"C. Eickhoff , C. G. Harris , P. Srinivasan , and A. P. de Vries. GEAnn - Games for Engaging Annotations. In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval , 2011 . C. Eickhoff, C. G. Harris, P. Srinivasan, and A. P. de Vries. GEAnn - Games for Engaging Annotations. In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval, 2011."},{"key":"e_1_2_1_10_1","volume-title":"Technical Report CMU-LTI-017","author":"Elsas J. L.","year":"2011","unstructured":"J. L. Elsas . Ancestry.com online forum test collection. Technical Report CMU-LTI-017 , Language Technologies Institute, School of Computer Science, Carnegie Mellon University , 2011 . J. L. Elsas. Ancestry.com online forum test collection. Technical Report CMU-LTI-017, Language Technologies Institute, School of Computer Science, Carnegie Mellon University, 2011."},{"key":"e_1_2_1_11_1","doi-asserted-by":"publisher","DOI":"10.1109\/MIS.2009.36"},{"key":"e_1_2_1_12_1","volume-title":"Proceedings of the ACM SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation (CSE 2010","author":"Lease M.","year":"2010","unstructured":"M. Lease , V. Carvalho , and E. Yilmaz , editors . Proceedings of the ACM SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation (CSE 2010 ). Geneva, Switzerland , July 2010 . Available online at http:\/\/ir.ischool.utexas.edu\/cse2010. M. Lease, V. Carvalho, and E. Yilmaz, editors. Proceedings of the ACM SIGIR 2010 Workshop on Crowdsourcing for Search Evaluation (CSE 2010). Geneva, Switzerland, July 2010. Available online at http:\/\/ir.ischool.utexas.edu\/cse2010."},{"key":"e_1_2_1_13_1","doi-asserted-by":"publisher","DOI":"10.1145\/1988852.1988856"},{"key":"e_1_2_1_14_1","volume-title":"Proceedings of the Workshop on Crowdsourcing for Search and Data Mining (CSDM) at the Fourth ACM International Conference on Web Search and Data Mining (WSDM)","author":"Lease M.","year":"2011","unstructured":"M. Lease , V. Carvalho , and E. Yilmaz , editors . 
Proceedings of the Workshop on Crowdsourcing for Search and Data Mining (CSDM) at the Fourth ACM International Conference on Web Search and Data Mining (WSDM) . Hong Kong, China , February 2011 . Available online at http:\/\/ir.ischool.utexas.edu\/cse2010\/program.html. M. Lease, V. Carvalho, and E. Yilmaz, editors. Proceedings of the Workshop on Crowdsourcing for Search and Data Mining (CSDM) at the Fourth ACM International Conference on Web Search and Data Mining (WSDM). Hong Kong, China, February 2011. Available online at http:\/\/ir.ischool.utexas.edu\/cse2010\/program.html."},{"key":"e_1_2_1_15_1","doi-asserted-by":"publisher","DOI":"10.1145\/2093346.2093356"},{"key":"e_1_2_1_16_1","doi-asserted-by":"publisher","DOI":"10.1561\/1500000016"},{"key":"e_1_2_1_17_1","volume-title":"Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval","author":"Schone P.","year":"2011","unstructured":"P. Schone and M. Jones . Genealogical Search Analysis Using Crowd Sourcing . In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval , 2011 . P. Schone and M. Jones. Genealogical Search Analysis Using Crowd Sourcing. In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval, 2011."},{"key":"e_1_2_1_18_1","first-page":"14","author":"Schwartz B.","year":"2008","unstructured":"B. Schwartz . The Google Quality Raters Handbook , 2008 . March 14 . http:\/\/searchengineland.com\/the-google-quality-raters-handbook-13575. B. Schwartz. The Google Quality Raters Handbook, 2008. March 14. http:\/\/searchengineland.com\/the-google-quality-raters-handbook-13575.","journal-title":"The Google Quality Raters Handbook"},{"key":"e_1_2_1_19_1","volume-title":"Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval","author":"Smucker M.","year":"2011","unstructured":"M. Smucker and C. P. Jethani . The Crowd vs. the Lab: A Comparison of Crowd-Sourced and University Laboratory Participant Behavior . In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval , 2011 . M. Smucker and C. P. Jethani. The Crowd vs. the Lab: A Comparison of Crowd-Sourced and University Laboratory Participant Behavior. In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval, 2011."},{"key":"e_1_2_1_20_1","volume-title":"Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval","author":"Stone M.","year":"2011","unstructured":"M. Stone , K. Kim , S. Myagmar , and O. Alonso . A Comparison of On-Demand Workforce with Trained Judges for Web Search Relevance Evaluation . In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval , 2011 . M. Stone, K. Kim, S. Myagmar, and O. Alonso. A Comparison of On-Demand Workforce with Trained Judges for Web Search Relevance Evaluation. In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval, 2011."},{"key":"e_1_2_1_21_1","volume-title":"Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval","author":"Su Q.","year":"2011","unstructured":"Q. Su . An Ensemble Framework for Predicting Best Community Answers . In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval , 2011 . Q. Su. An Ensemble Framework for Predicting Best Community Answers. 
In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval, 2011."},{"key":"e_1_2_1_22_1","volume-title":"Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval","author":"Tai L.","year":"2011","unstructured":"L. Tai , Z. Chuang , X. Tao , W. Ming , and X. Jingjing . Quality Control of Crowdsourcing through Workers Experience . In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval , 2011 . L. Tai, Z. Chuang, X. Tao, W. Ming, and X. Jingjing. Quality Control of Crowdsourcing through Workers Experience. In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval, 2011."},{"key":"e_1_2_1_23_1","volume-title":"Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval","author":"Tang W.","year":"2011","unstructured":"W. Tang and M. Lease . Semi-Supervised Consensus Labeling for Crowdsourcing . In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval , 2011 . W. Tang and M. Lease. Semi-Supervised Consensus Labeling for Crowdsourcing. In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval, 2011."},{"key":"e_1_2_1_24_1","volume-title":"Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval","author":"Vallet D.","year":"2011","unstructured":"D. Vallet . Crowdsourced Evaluation of Personalization and Diversification Techniques in Web Search . In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval , 2011 . D. Vallet. Crowdsourced Evaluation of Personalization and Diversification Techniques in Web Search. In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval, 2011."},{"key":"e_1_2_1_25_1","doi-asserted-by":"publisher","DOI":"10.1145\/985692.985733"},{"key":"e_1_2_1_26_1","volume-title":"Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval","author":"Vuurens J.","year":"2011","unstructured":"J. Vuurens , A. P. D. Vries , and C. Eickhoff . How Much Spam Can You Take? An Analysis of Crowdsourcing Results to Increase Accuracy . In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval , 2011 . J. Vuurens, A. P. D. Vries, and C. Eickhoff. How Much Spam Can You Take? An Analysis of Crowdsourcing Results to Increase Accuracy. In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval, 2011."},{"key":"e_1_2_1_27_1","volume-title":"Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval","author":"Wang J.","year":"2011","unstructured":"J. Wang and B. Yu . Labeling Images with Queries: A Recall-based Image Retrieval Game Approach . In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval , 2011 . J. Wang and B. Yu. Labeling Images with Queries: A Recall-based Image Retrieval Game Approach. 
In Proceedings of the ACM SIGIR Workshop on Crowdsourcing for Information Retrieval, 2011."},{"key":"e_1_2_1_28_1","doi-asserted-by":"publisher","DOI":"10.1002\/meet.2011.14504801135"},{"key":"e_1_2_1_29_1","doi-asserted-by":"publisher","DOI":"10.1145\/1814433.1814443"},{"key":"e_1_2_1_30_1","doi-asserted-by":"publisher","DOI":"10.1145\/1390334.1390437"}],"container-title":["ACM SIGIR Forum"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/2093346.2093356","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/2093346.2093356","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,18]],"date-time":"2025-06-18T09:48:46Z","timestamp":1750240126000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/2093346.2093356"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2012,1,9]]},"references-count":30,"journal-issue":{"issue":"2","published-print":{"date-parts":[[2012,1,9]]}},"alternative-id":["10.1145\/2093346.2093356"],"URL":"https:\/\/doi.org\/10.1145\/2093346.2093356","relation":{},"ISSN":["0163-5840"],"issn-type":[{"type":"print","value":"0163-5840"}],"subject":[],"published":{"date-parts":[[2012,1,9]]},"assertion":[{"value":"2012-01-09","order":2,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}
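The record above is the JSON envelope that the Crossref REST API returns for this article's DOI (10.1145/2093346.2093356). As a minimal sketch of how such a record can be retrieved and read programmatically, the snippet below fetches the same DOI from the public api.crossref.org endpoint and prints a few of the fields present in the record (title, container-title, volume/issue/page, reference DOIs); the endpoint choice, variable names, and lack of error handling are illustrative assumptions, not part of the record itself.

# Hedged sketch: fetch this Crossref "work" record by DOI and read fields
# that appear in the JSON above. Endpoint and parsing details are assumptions.
import json
import urllib.request

DOI = "10.1145/2093346.2093356"
url = f"https://api.crossref.org/works/{DOI}"

with urllib.request.urlopen(url) as resp:
    envelope = json.loads(resp.read().decode("utf-8"))

# The envelope mirrors the structure shown: status, message-type, message.
work = envelope["message"]
print(work["title"][0])                 # "Crowdsourcing for information retrieval"
print(work["container-title"][0])       # "ACM SIGIR Forum"
print(work["volume"], work["issue"], work["page"])  # 45 2 66-75

# References are a list of dicts; DOI-asserted entries carry a "DOI" key,
# the others only an "unstructured" citation string.
for ref in work.get("reference", []):
    print(ref.get("DOI") or ref.get("unstructured", "")[:80])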