{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,20]],"date-time":"2026-03-20T17:06:09Z","timestamp":1774026369561,"version":"3.50.1"},"reference-count":34,"publisher":"Wiley","issue":"1","license":[{"start":{"date-parts":[[2013,1,24]],"date-time":"2013-01-24T00:00:00Z","timestamp":1358985600000},"content-version":"vor","delay-in-days":389,"URL":"http:\/\/onlinelibrary.wiley.com\/termsAndConditions#vor"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Proc of Assoc for Info"],"published-print":{"date-parts":[[2012,1]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:p>While usability evaluation is critical to designing usable websites, traditional usability testing can be both expensive and time consuming. The advent of crowdsourcing platforms such as Amazon Mechanical Turk and CrowdFlower offer an intriguing new avenue for performing remote usability testing with potentially many users, quick turn\u2010around, and significant cost savings. To investigate the potential of such crowdsourced usability testing, we conducted a usability study which evaluated a graduate school's website using a crowdsourcing platform. In addition, we performed a similar but not identical traditional lab usability test on the same site. While we find that crowdsourcing exhibits some notable limitations in comparison to the traditional lab environment, its applicability and value for usability testing is clearly evidenced. 
We discuss both methodological differences for crowdsourced usability testing, as well as empirical contrasts to results from more traditional, face\u2010to\u2010face usability testing.<\/jats:p>","DOI":"10.1002\/meet.14504901100","type":"journal-article","created":{"date-parts":[[2013,1,24]],"date-time":"2013-01-24T10:49:23Z","timestamp":1359024563000},"page":"1-10","source":"Crossref","is-referenced-by-count":53,"title":["Crowdsourcing for usability testing"],"prefix":"10.1002","volume":"49","author":[{"given":"Di","family":"Liu","sequence":"first","affiliation":[]},{"given":"Randolph G.","family":"Bias","sequence":"additional","affiliation":[]},{"given":"Matthew","family":"Lease","sequence":"additional","affiliation":[]},{"given":"Rebecca","family":"Kuipers","sequence":"additional","affiliation":[]}],"member":"311","published-online":{"date-parts":[[2013,1,24]]},"reference":[{"key":"e_1_2_9_2_1","unstructured":"Amazon2011.Amazon Mechanical Turk.http:\/\/aws.amazon.com\/mturk\/#pricing. Visited April 2012."},{"key":"e_1_2_9_3_1","doi-asserted-by":"crossref","unstructured":"Bailey G. D.1993.Iterative methodology and designer training in human\u2010computer interface design INTERCHI '93 198\u2013205.","DOI":"10.1145\/169059.169163"},{"key":"e_1_2_9_4_1","unstructured":"Bias R. G. &Huang S. C.2010.Remote remote remote remote usability testing.Proceedings of the Society for Technical Communication Summit May Dallas."},{"key":"e_1_2_9_5_1","volume-title":"Cost\u2010justifying usability, 2nd edition: Update for the Internet age","author":"Bias R. G.","year":"2005"},{"key":"e_1_2_9_6_1","article-title":"Online labour markets: an inquiry into oDesk providers","volume":"4","author":"Caraway B.","year":"2010","journal-title":"Work Organizations, Labour and Globalisation"},{"key":"e_1_2_9_7_1","unstructured":"Chen D. &Dolan W.2011.Collecting Highly Parallel Data for Paraphrase Evaluation. InProc. 
of the Association for Computational Linguistics (ACL)."},{"key":"e_1_2_9_8_1","unstructured":"CrowdFlower2011.FAQ \u2014 Self\u2010Service \u2014 CrowdFlower.http:\/\/api.crowdflower.com\/solutions\/self\u2010service. Visited April 2012."},{"key":"e_1_2_9_9_1","doi-asserted-by":"crossref","unstructured":"Downs J. S. Holbrook M. B. Sheng S. &Cranor L. F.2010.Are your participants gaming the system? Screening Mechanical Turk Workers. In CHI '10: Proceedings of the 28th international conference on Human factors in computing systems ACM pp.2399\u20132402. 2010","DOI":"10.1145\/1753326.1753688"},{"key":"e_1_2_9_10_1","unstructured":"Frei B.2009.\u201cPaid crowdsourcing: Current state & progress toward mainstream business use\u201d Whitepaper produced by Smartsheet.com 2009."},{"key":"e_1_2_9_11_1","doi-asserted-by":"crossref","unstructured":"Heymann P. &Garcia\u2010Molina H.2011.Turkalytics: analytics for human computation.Proceedings of the 20th international conference on World wide web (WWW) pp.477\u2013486 2011.","DOI":"10.1145\/1963405.1963473"},{"key":"e_1_2_9_12_1","doi-asserted-by":"crossref","unstructured":"Rzeszotarski J. M. &Kittur A.2011.Instrumenting the crowd: using implicit behavioral measures to predict task performance.Proceedings of the 24th annual ACM symposium on User interface software and technology (UIST) 13\u201322 2011.","DOI":"10.1145\/2047196.2047199"},{"key":"e_1_2_9_13_1","unstructured":"Howe J.2006.The Rise of Crowdsourcing."},{"key":"e_1_2_9_14_1","unstructured":"Howe J.2008.Crowdsourcing: A Definition.http:\/\/crowdsourcing.typepad.com. Visited April 2012."},{"key":"e_1_2_9_15_1","unstructured":"Howe J.2008.Crowdsourcing: Now with a real business model!http:\/\/www.wired.com\/epicenter\/2008\/12\/crowdsourcing\u2010n. 
Visited April 2012."},{"key":"e_1_2_9_16_1","volume-title":"CeDER\u201010\u201001 working paper","author":"Ipeirotis P.","year":"2010"},{"key":"e_1_2_9_17_1","unstructured":"Ipeirotis P.20101Be a Top Mechanical Turk Worker: You Need $5 and 5 Minutes.http:\/\/behind\u2010the\u2010enemy\u2010lines.blogspot.com\/2010\/10\/be\u2010top\u2010mechanical\u2010turk\u2010worker\u2010you\u2010need.html. Visited April 2012."},{"key":"e_1_2_9_18_1","doi-asserted-by":"crossref","unstructured":"Ipeirotis P. Provost F. &Wang J.2010.Quality Management on Amazon Mechanical Turk. InHCOMP '10 New York NY USA 2010.","DOI":"10.1145\/1837885.1837906"},{"key":"e_1_2_9_19_1","unstructured":"ISO 9241\u201011 Ergonomic requirements for office work with visual display teminals (VDT)s \u2014 Part 11 Guidance on Usability International Standard 1998"},{"key":"e_1_2_9_20_1","unstructured":"Kapelner A. &Chandler D.2010.Preventing Satisficing in Online Surveys: A \u201cKapcha\u201d to Ensure Higher Quality Data. InCrowdConf Proceedings 2010."},{"key":"e_1_2_9_21_1","doi-asserted-by":"crossref","unstructured":"Kittur A. Chi E. H. &Suh B.2008.Crowdsourcing user studies with Mechanical Turk. In theProceedings of the ACM CHI Conference.","DOI":"10.1145\/1357054.1357127"},{"key":"e_1_2_9_22_1","doi-asserted-by":"crossref","unstructured":"Kochhar S. Mazzocchi S. &ParitoshP.2010The anatomy of a large\u2010scale human computation engine. In HCOMP '10 New York NY USA 2010.","DOI":"10.1145\/1837885.1837890"},{"key":"e_1_2_9_23_1","doi-asserted-by":"publisher","DOI":"10.1145\/1809400.1809422"},{"key":"e_1_2_9_24_1","unstructured":"Mason W.andSuri S.2010.A Guide to Conducting Behavioral Research on Amazon's Mechanical Turk. InSocial Science Research Network pre\u2010print. 2010."},{"key":"e_1_2_9_25_1","doi-asserted-by":"publisher","DOI":"10.1080\/01449290600959062"},{"key":"e_1_2_9_26_1","unstructured":"Nielsen J.2003.Usability 101: Introduction to Usability.http:\/\/www.useit.com\/alertbox\/20030825.html. 
Visited April 2012."},{"key":"e_1_2_9_27_1","doi-asserted-by":"publisher","DOI":"10.1145\/259963.260531"},{"key":"e_1_2_9_28_1","unstructured":"Quinn A. J. &Bederson B. B. 2011Human Computation: A Survey and Taxonomy of a Growing Field. InProceedings of CHI2011 May 7\u201012 2011 Vancouver BC Canada."},{"key":"e_1_2_9_29_1","doi-asserted-by":"crossref","unstructured":"Ross J. Irani I. Silberman M. Six Zaldivar A. &Tomlinson B.2010. \u201cWho are the Crowdworkers?: Shifting Demographics in Amazon Mechanical Turk\u201d. In:CHI EA 2010. (2863\u20132872).","DOI":"10.1145\/1753846.1753873"},{"key":"e_1_2_9_30_1","volume-title":"Handbook of usability testing: How to plan, design, and conduct effective tests","author":"Rubin J.","year":"2008"},{"key":"e_1_2_9_31_1","doi-asserted-by":"crossref","unstructured":"Spool J. &Schroeder W.2001.Test web sites: five users is nowhere enough. InProc. CHI2001 Extended Abstract. Seattle: ACM Press.285\u2013286","DOI":"10.1145\/634067.634236"},{"key":"e_1_2_9_32_1","unstructured":"uTest2011a. About Us.http:\/\/www.utest.com\/company\/. Visited April 2012."},{"key":"e_1_2_9_33_1","unstructured":"uTest2011b.How It Works.http:\/\/www.utest.com\/how\u2010it\u2010works\/agile\u2010testing\/. Visited April 2012."},{"key":"e_1_2_9_34_1","unstructured":"uTest2011c.Pricing Information.http:\/\/www.utest.com\/pricing. 
Visited April 2012."},{"key":"e_1_2_9_35_1","unstructured":"Winsor J.2009.Crowdsourcing: What it means for innovation.http:\/\/www.businessweek.com\/innovate\/content\/jun2009\/id20090615_946326.htmlVisited April 2012."}],"container-title":["Proceedings of the American Society for Information Science and Technology"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/api.wiley.com\/onlinelibrary\/tdm\/v1\/articles\/10.1002%2Fmeet.14504901100","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/api.wiley.com\/onlinelibrary\/tdm\/v1\/articles\/10.1002%2Fmeet.14504901100","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/asistdl.onlinelibrary.wiley.com\/doi\/pdf\/10.1002\/meet.14504901100","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,20]],"date-time":"2025-10-20T11:36:53Z","timestamp":1760960213000},"score":1,"resource":{"primary":{"URL":"https:\/\/asistdl.onlinelibrary.wiley.com\/doi\/10.1002\/meet.14504901100"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2012,1]]},"references-count":34,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2012,1]]}},"alternative-id":["10.1002\/meet.14504901100"],"URL":"https:\/\/doi.org\/10.1002\/meet.14504901100","archive":["Portico"],"relation":{},"ISSN":["0044-7870","1550-8390"],"issn-type":[{"value":"0044-7870","type":"print"},{"value":"1550-8390","type":"electronic"}],"subject":[],"published":{"date-parts":[[2012,1]]}}}