{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,14]],"date-time":"2026-04-14T00:54:25Z","timestamp":1776128065515,"version":"3.50.1"},"reference-count":101,"publisher":"Association for Computing Machinery (ACM)","issue":"CSCW2","license":[{"start":{"date-parts":[[2021,10,18]],"date-time":"2021-10-18T00:00:00Z","timestamp":1634515200000},"content-version":"vor","delay-in-days":5,"URL":"http:\/\/www.acm.org\/publications\/policies\/copyright_policy#Background"}],"funder":[{"DOI":"10.13039\/100000001","name":"National Science Foundation","doi-asserted-by":"publisher","award":["IIS-2040942"],"award-info":[{"award-number":["IIS-2040942"]}],"id":[{"id":"10.13039\/100000001","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":["Proc. ACM Hum.-Comput. Interact."],"published-print":{"date-parts":[[2021,10,13]]},"abstract":"<jats:p>A growing body of literature has proposed formal approaches to audit algorithmic systems for biased and harmful behaviors. While formal auditing approaches have been greatly impactful, they often suffer major blindspots, with critical issues surfacing only in the context of everyday use once systems are deployed. Recent years have seen many cases in which everyday users of algorithmic systems detect and raise awareness about harmful behaviors that they encounter in the course of their everyday interactions with these systems. However, to date little academic attention has been granted to these bottom-up, user-driven auditing processes. In this paper, we propose and explore the concept of everyday algorithm auditing, a process in which users detect, understand, and interrogate problematic machine behaviors via their day-to-day interactions with algorithmic systems. 
We argue that everyday users are powerful in surfacing problematic machine behaviors that may elude detection via more centrally-organized forms of auditing, regardless of users' knowledge about the underlying algorithms. We analyze several real-world cases of everyday algorithm auditing, drawing lessons from these cases for the design of future platforms and tools that facilitate such auditing behaviors. Finally, we discuss work that lies ahead, toward bridging the gaps between formal auditing approaches and the organic auditing behaviors that emerge in everyday use of algorithmic systems.<\/jats:p>","DOI":"10.1145\/3479577","type":"journal-article","created":{"date-parts":[[2021,10,19]],"date-time":"2021-10-19T02:46:19Z","timestamp":1634611579000},"page":"1-29","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":136,"title":["Everyday Algorithm Auditing: Understanding the Power of Everyday Users in Surfacing Harmful Algorithmic Behaviors"],"prefix":"10.1145","volume":"5","author":[{"given":"Hong","family":"Shen","sequence":"first","affiliation":[{"name":"Carnegie Mellon University, Pittsburgh, PA, USA"}]},{"given":"Alicia","family":"DeVos","sequence":"additional","affiliation":[{"name":"Carnegie Mellon University, Pittsburgh, PA, USA"}]},{"given":"Motahhare","family":"Eslami","sequence":"additional","affiliation":[{"name":"Carnegie Mellon University, Pittsburgh, PA, USA"}]},{"given":"Kenneth","family":"Holstein","sequence":"additional","affiliation":[{"name":"Carnegie Mellon University, Pittsburgh, PA, USA"}]}],"member":"320","published-online":{"date-parts":[[2021,10,18]]},"reference":[{"key":"e_1_2_1_1_1","volume-title":"Twitter investigates racial bias in image previews. BBC News (Sep","year":"2020","unstructured":"2020. Twitter investigates racial bias in image previews. BBC News (Sep 2020). https:\/\/www.bbc.com\/news\/technology-54234822"},{"key":"e_1_2_1_2_1","unstructured":"2021. 
The Algorithmic Justice League: Mission Team and Story. https:\/\/www.ajl.org\/about"},{"key":"e_1_2_1_3_1","volume-title":"Transparency around image cropping and changes to come. Twitter","author":"Agrawal Parag","year":"2020","unstructured":"Parag Agrawal and Dantley Davis. 2020. Transparency around image cropping and changes to come. Twitter (2020). https:\/\/blog.twitter.com\/en_us\/topics\/product\/2020\/transparency-image-cropping.html."},{"key":"e_1_2_1_4_1","doi-asserted-by":"publisher","DOI":"10.1111\/j.1468-0297.2012.02512.x"},{"key":"e_1_2_1_5_1","doi-asserted-by":"publisher","DOI":"10.1609\/icwsm.v14i1.7276"},{"key":"e_1_2_1_6_1","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v32i1.11493"},{"key":"e_1_2_1_7_1","volume-title":"Amazon changes its key formula for calculating product ratings and displaying reviews. GeekWire (June","author":"Bishop Todd","year":"2015","unstructured":"Todd Bishop. 2015. Amazon changes its key formula for calculating product ratings and displaying reviews. GeekWire (June 2015). https:\/\/www.geekwire.com\/2015\/amazon-changes-its-influential-formula-for-calculating-productratings"},{"key":"e_1_2_1_8_1","volume-title":"Hal Daum\u00e9 III, and Hanna Wallach","author":"Blodgett Su Lin","year":"2020","unstructured":"Su Lin Blodgett, Solon Barocas, Hal Daum\u00e9 III, and Hanna Wallach. 2020. Language (technology) is power: A critical survey of\" bias\" in nlp. arXiv preprint arXiv:2005.14050 (2020)."},{"key":"e_1_2_1_9_1","unstructured":"Rucksack Brian. 2019. Major changes to review scoring method on Booking com. https:\/\/hostelmanagement.com\/forums\/major-changes-review-scoring-method-booking-com.html"},{"key":"e_1_2_1_10_1","volume-title":"Twitter users say the platform crops out Black faces. CBS News (Sep","author":"Brooks Khristopher J","year":"2020","unstructured":"Khristopher J Brooks. 2020. Twitter users say the platform crops out Black faces. CBS News (Sep 2020). 
https:\/\/www.cbsnews.com\/news\/twitter-image-cropping-algorithm-racial-profiling\/"},{"key":"e_1_2_1_11_1","volume-title":"How I'm fighting bias in algorithms. November","author":"Buolamwini Joy","year":"2016","unstructured":"Joy Buolamwini. 2016. How I'm fighting bias in algorithms. November, TEDx Beacon Street)[Video File] (2016)."},{"key":"e_1_2_1_12_1","volume-title":"Proceedings of the 1st Conference on Fairness, Accountability and Transparency PMLR. 77--91","author":"Buolamwini Joy","year":"2018","unstructured":"Joy Buolamwini and Timnit Gebru. 2018. Gender shades: Intersectional accuracy disparities in commercial gender classification. In Proceedings of the 1st Conference on Fairness, Accountability and Transparency PMLR. 77--91."},{"key":"e_1_2_1_13_1","doi-asserted-by":"publisher","DOI":"10.1145\/2818048.2820023"},{"key":"e_1_2_1_14_1","doi-asserted-by":"publisher","DOI":"10.1145\/3173574.3174225"},{"key":"e_1_2_1_15_1","doi-asserted-by":"publisher","DOI":"10.1145\/3415185"},{"key":"e_1_2_1_16_1","volume-title":"Introducing Twitter's first algorithmic bias bounty challenge. Twitter","author":"Chowdhury Rumman","year":"2021","unstructured":"Rumman Chowdhury and Jutta Williams. 2021. Introducing Twitter's first algorithmic bias bounty challenge. Twitter (2021). https:\/\/blog.twitter.com\/engineering\/en_us\/topics\/insights\/2021\/algorithmic-bias-bounty-challenge"},{"key":"e_1_2_1_17_1","doi-asserted-by":"publisher","DOI":"10.1145\/3278156"},{"key":"e_1_2_1_18_1","doi-asserted-by":"publisher","DOI":"10.1145\/1978942.1979213"},{"key":"e_1_2_1_19_1","volume-title":"Excavating AI: The politics of images in machine learning training sets","author":"Crawford Kate","year":"2021","unstructured":"Kate Crawford and Trevor Paglen. 2021. Excavating AI: The politics of images in machine learning training sets. 
AI & Society (2021), 1--12."},{"key":"e_1_2_1_20_1","volume-title":"Amazon scraps secret AI recruiting tool that showed bias against women","author":"Dastin Jeffrey","year":"2018","unstructured":"Jeffrey Dastin. 2018. Amazon scraps secret AI recruiting tool that showed bias against women. Reuters (Oct. 2018). https:\/\/www.reuters.com\/article\/us-amazon-com-jobs-automation-insight\/amazon-scraps-secret-ai-recruitingtool-that-showed-bias-against-women-idUSKCN1MK08G"},{"key":"e_1_2_1_21_1","volume-title":"The selfish gene","author":"Dawkins Richard","unstructured":"Richard Dawkins. 2016. The selfish gene. Oxford University Press."},{"key":"e_1_2_1_22_1","volume-title":"The practice of everyday life, trans Steven Rendall","author":"Certeau Michel De","unstructured":"Michel De Certeau. 1984. The practice of everyday life, trans Steven Rendall. Berkeley: University of California Press."},{"key":"e_1_2_1_23_1","doi-asserted-by":"publisher","DOI":"10.1145\/3173574.3173694"},{"key":"e_1_2_1_24_1","doi-asserted-by":"publisher","DOI":"10.1145\/3025453.3025659"},{"key":"e_1_2_1_25_1","volume-title":"Algorithmic accountability reporting: On the investigation of black boxes","author":"Diakopoulos Nicholas","year":"2014","unstructured":"Nicholas Diakopoulos. 2014. Algorithmic accountability reporting: On the investigation of black boxes. The Tow Center for Digital Journalism (2014)."},{"key":"e_1_2_1_26_1","doi-asserted-by":"publisher","DOI":"10.1145\/2145204.2145355"},{"key":"e_1_2_1_27_1","doi-asserted-by":"publisher","DOI":"10.1145\/2858036.2858494"},{"key":"e_1_2_1_28_1","doi-asserted-by":"publisher","DOI":"10.1609\/icwsm.v11i1.14898"},{"key":"e_1_2_1_29_1","doi-asserted-by":"publisher","DOI":"10.1145\/3290605.3300724"},{"key":"e_1_2_1_30_1","doi-asserted-by":"publisher","DOI":"10.1145\/2207676.2207711"},{"key":"e_1_2_1_31_1","volume-title":"Judge dismisses suit against Yelp. 
The Wall Street Journal (Oct","author":"Fowler Geoffrey A.","year":"2011","unstructured":"Geoffrey A. Fowler. 2011. Judge dismisses suit against Yelp. The Wall Street Journal (Oct. 2011). https:\/\/www.wsj.com\/articles\/SB10001424052970204505304577002170423750412"},{"key":"e_1_2_1_32_1","volume-title":"Rethinking the public sphere: A contribution to the critique of actually existing democracy. Social text 25\/26","author":"Fraser Nancy","year":"1990","unstructured":"Nancy Fraser. 1990. Rethinking the public sphere: A contribution to the critique of actually existing democracy. Social text 25\/26 (1990), 56--80."},{"key":"e_1_2_1_33_1","doi-asserted-by":"publisher","DOI":"10.1145\/230538.230561"},{"key":"e_1_2_1_34_1","volume-title":"The Internet has unearthed more racist Google Maps results. The Washington Post (May","author":"Fung Brian","year":"2015","unstructured":"Brian Fung. 2015. The Internet has unearthed more racist Google Maps results. The Washington Post (May 2015). https:\/\/www.washingtonpost.com\/news\/the-switch\/wp\/2015\/05\/21\/the-internet-has-unearthed-more-racistgoogle-maps-results"},{"key":"e_1_2_1_35_1","doi-asserted-by":"publisher","DOI":"10.1080\/1369118X.2016.1153700"},{"key":"e_1_2_1_36_1","doi-asserted-by":"publisher","DOI":"10.1145\/2441776.2441873"},{"key":"e_1_2_1_37_1","doi-asserted-by":"publisher","DOI":"10.7551\/mitpress\/9780262525374.001.0001"},{"key":"e_1_2_1_38_1","volume-title":"Court says Yelp doesn't extort businesses. Forbes (Sept","author":"Goldman Eric","year":"2014","unstructured":"Eric Goldman. 2014. Court says Yelp doesn't extort businesses. Forbes (Sept. 2014). https:\/\/www.forbes.com\/sites\/ericgoldman\/2014\/09\/03\/court-says-yelp-doesnt-extort-businesses"},{"key":"e_1_2_1_39_1","volume-title":"ICML Workshop on Human Interpretability in Machine Learning.","author":"Goodman Bryce","year":"2016","unstructured":"Bryce Goodman and Seth Flaxman. 2016. 
EU regulations on algorithmic decision-making and a \"right to explanation\". In ICML Workshop on Human Interpretability in Machine Learning."},{"key":"e_1_2_1_40_1","doi-asserted-by":"publisher","DOI":"10.1145\/2702123.2702395"},{"key":"e_1_2_1_41_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.chb.2013.04.016"},{"key":"e_1_2_1_42_1","volume-title":"Google photos labeled black people 'gorillas'. USA Today (June","author":"Guynn Jessica","year":"2015","unstructured":"Jessica Guynn. 2015. Google photos labeled black people 'gorillas'. USA Today (June 2015). https:\/\/www.usatoday.com\/story\/tech\/2015\/07\/01\/google-apologizes-after-photos-identify-black-people-as-gorillas\/29567465"},{"key":"e_1_2_1_43_1","volume-title":"The structural transformation of the public sphere: An inquiry into a category of bourgeois society","author":"Habermas J\u00fcrgen","unstructured":"J\u00fcrgen Habermas. 1989. The structural transformation of the public sphere: An inquiry into a category of bourgeois society. MIT press."},{"key":"e_1_2_1_44_1","doi-asserted-by":"publisher","DOI":"10.1145\/2038558.2038585"},{"key":"e_1_2_1_45_1","doi-asserted-by":"publisher","DOI":"10.1145\/2663716.2663744"},{"key":"e_1_2_1_46_1","volume-title":"Twitter apologises for 'racist' image-cropping algorithm. The Guardian (Sept","author":"Hern Alex","year":"2020","unstructured":"Alex Hern. 2020. Twitter apologises for 'racist' image-cropping algorithm. The Guardian (Sept. 2020). https:\/\/www.theguardian.com\/technology\/2020\/sep\/21\/twitter-apologises-for-racist-image-cropping-algorithm"},{"key":"e_1_2_1_47_1","doi-asserted-by":"publisher","DOI":"10.1145\/3357236.3395427"},{"key":"e_1_2_1_48_1","doi-asserted-by":"publisher","DOI":"10.1145\/3290605.3300830"},{"key":"e_1_2_1_49_1","volume-title":"Big Data: A Report on Algorithmic Systems. Opportunity, and Civil Rights","author":"House White","year":"2016","unstructured":"White House. 2016. Big Data: A Report on Algorithmic Systems.
Opportunity, and Civil Rights (2016)."},{"key":"e_1_2_1_50_1","volume-title":"Hilary Mason, Andrew Ng, and Rumman Chowdhury. VentureBeat (Jan","author":"Johnson Khari","year":"2019","unstructured":"Khari Johnson. 2019. AI predictions for 2019 from Yann LeCun, Hilary Mason, Andrew Ng, and Rumman Chowdhury. VentureBeat (Jan 2019). https:\/\/venturebeat.com\/2019\/01\/02\/ai-predictions-for-2019-from-yann-lecun-hilary-masonandrew-ng-and-rumman-chowdhury\/"},{"key":"e_1_2_1_51_1","volume-title":"Apparent racial bias found in Twitter photo algorithm. VentureBeat (Sep","author":"Johnson Khari","year":"2020","unstructured":"Khari Johnson. 2020. Apparent racial bias found in Twitter photo algorithm. VentureBeat (Sep 2020). https:\/\/venturebeat.com\/2020\/09\/20\/apparent-racial-bias-found-in-twitter-photo-algorithm\/"},{"key":"e_1_2_1_52_1","volume-title":"'Yelp is the thug of the Internet'. MuckRock","author":"Kang Inkoo","year":"2013","unstructured":"Inkoo Kang. 2013. Businesses: 'Yelp is the thug of the Internet'. MuckRock (2013). https:\/\/www.muckrock.com\/news\/archives\/2013\/jan\/23\/businesses-yelp-thug-of-the-internet."},{"key":"e_1_2_1_53_1","volume-title":"Read nearly 700 FTC complaints regarding Yelp. MuckRock (Jan","author":"Kang Inkoo","year":"2013","unstructured":"Inkoo Kang. 2013. Read nearly 700 FTC complaints regarding Yelp. MuckRock (Jan 2013). 
https:\/\/www.muckrock.com\/news\/archives\/2013\/jan\/23\/businesses-yelp-thug-of-the-internet\/"},{"key":"e_1_2_1_54_1","doi-asserted-by":"publisher","DOI":"10.1145\/3476046"},{"key":"e_1_2_1_55_1","doi-asserted-by":"publisher","DOI":"10.1145\/2998181.2998321"},{"key":"e_1_2_1_56_1","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v31i1.10821"},{"key":"e_1_2_1_57_1","doi-asserted-by":"publisher","DOI":"10.1145\/3411764.3445261"},{"key":"e_1_2_1_58_1","doi-asserted-by":"publisher","DOI":"10.1145\/1557019.1557077"},{"key":"e_1_2_1_59_1","doi-asserted-by":"publisher","DOI":"10.1145\/3366423.3380306"},{"key":"e_1_2_1_60_1","doi-asserted-by":"publisher","DOI":"10.1145\/3313831.3376445"},{"key":"e_1_2_1_61_1","volume-title":"Why Twitter's image cropping algorithm appears to have white bias. TNW | Neural (Mar","author":"Mehta Ivan","year":"2021","unstructured":"Ivan Mehta. 2021. Why Twitter's image cropping algorithm appears to have white bias. TNW | Neural (Mar 2021). https:\/\/thenextweb.com\/news\/why-twitters-image-cropping-algorithm-appears-to-have-white-bias"},{"key":"e_1_2_1_62_1","volume-title":"How Might A.I. Label You? The New York Times (Sep","author":"Metz Cade","year":"2019","unstructured":"Cade Metz. 2019. 'Nerd,' 'Nonsmoker,' 'Wrongdoer': How Might A.I. Label You? The New York Times (Sep 2019). https:\/\/www.nytimes.com\/2019\/09\/20\/arts\/design\/imagenet-trevor-paglen-ai-facial-recognition.html"},{"key":"e_1_2_1_63_1","volume-title":"Clear and convincing evidence: Measurement of discrimination in America","author":"Mincy Ronald B","unstructured":"Ronald B Mincy. 1993. The Urban Institute audit studies: Their research and policy context. In Clear and convincing evidence: Measurement of discrimination in America, M. Fix and R. J. Struyk (Eds.). 
Urban Institute Press, 165--86."},{"key":"e_1_2_1_64_1","doi-asserted-by":"publisher","DOI":"10.1007\/978-3-319-68612-7_62"},{"key":"e_1_2_1_65_1","volume-title":"Algorithms of oppression: How search engines reinforce racism","author":"Noble Safiya Umoja","unstructured":"Safiya Umoja Noble. 2018. Algorithms of oppression: How search engines reinforce racism. NYU Press."},{"key":"e_1_2_1_66_1","volume-title":"The algorithm that helped google translate become sexist. Forbes (Feb","author":"Olson Parmy","year":"2018","unstructured":"Parmy Olson. 2018. The algorithm that helped google translate become sexist. Forbes (Feb. 2018). https:\/\/www.forbes.com\/sites\/parmyolson\/2018\/02\/15\/the-algorithm-that-helped-google-translate-become-sexist\/'sh=6c1d0807daa2"},{"key":"e_1_2_1_67_1","volume-title":"Third Workshop on the Economics of Information Security. 19--26","author":"Ozment Andy","year":"2004","unstructured":"Andy Ozment. 2004. Bug auctions: Vulnerability markets reconsidered. In Third Workshop on the Economics of Information Security. 19--26."},{"key":"e_1_2_1_68_1","unstructured":"Francesca Palmiotto. 2020. Challenging automated filtering systems - The case of Yelp. https:\/\/ai-laws.org\/en\/2020\/05\/challenging-automated-filtering-systems-the-case-of-yelp\/"},{"key":"e_1_2_1_69_1","volume-title":"Becker","author":"Ragin Charles C.","year":"1992","unstructured":"Charles C. Ragin and Howard S. Becker. 1992. What is a case?: Exploring the foundations of social inquiry. Cambridge University Press."},{"key":"e_1_2_1_70_1","doi-asserted-by":"publisher","DOI":"10.1145\/3306618.3314244"},{"key":"e_1_2_1_71_1","doi-asserted-by":"publisher","DOI":"10.1145\/3351095.3372873"},{"key":"e_1_2_1_72_1","doi-asserted-by":"publisher","DOI":"10.1145\/3274417"},{"key":"e_1_2_1_73_1","volume-title":"A group of YouTubers is trying to prove the site systematically demonetizes Queer content. Vox (Oct","author":"Romano Aja","year":"2019","unstructured":"Aja Romano. 2019. 
A group of YouTubers is trying to prove the site systematically demonetizes Queer content. Vox (Oct. 2019). https:\/\/www.vox.com\/culture\/2019\/10\/10\/20893258\/youtube-lgbtq-censorship-demonetization-nerdcity-algorithm-report"},{"key":"e_1_2_1_74_1","volume-title":"Auditing algorithms: Research methods for detecting discrimination on internet platforms. Data and Discrimination: Converting Critical Concerns into Productive Inquiry","author":"Sandvig Christian","year":"2014","unstructured":"Christian Sandvig, Kevin Hamilton, Karrie Karahalios, and Cedric Langbort. 2014. Auditing algorithms: Research methods for detecting discrimination on internet platforms. Data and Discrimination: Converting Critical Concerns into Productive Inquiry (2014)."},{"key":"e_1_2_1_75_1","volume-title":"Twitter's AI Revealed to Have Racial Bias After a Viral Photo Experiment. Vice (Sep","author":"Sanjay Satviki","year":"2020","unstructured":"Satviki Sanjay. 2020. Twitter's AI Revealed to Have Racial Bias After a Viral Photo Experiment. Vice (Sep 2020). https:\/\/www.vice.com\/en\/article\/93547v\/twitter-ai-algorithm-has-racial-bias"},{"key":"e_1_2_1_76_1","doi-asserted-by":"publisher","DOI":"10.1007\/3-540-45831-X_6"},{"key":"e_1_2_1_77_1","volume-title":"Weapons of the weak: Everyday forms of peasant resistance","author":"Scott James C","unstructured":"James C Scott. 1985. Weapons of the weak: Everyday forms of peasant resistance. Yale University Press."},{"key":"e_1_2_1_78_1","doi-asserted-by":"publisher","DOI":"10.1177\/2053951717738104"},{"key":"e_1_2_1_79_1","volume-title":"Knowing algorithms. Digital STS","author":"Seaver Nick","year":"2019","unstructured":"Nick Seaver. 2019. Knowing algorithms. 
Digital STS (2019), 412--422."},{"key":"e_1_2_1_80_1","doi-asserted-by":"publisher","DOI":"10.1145\/3287560.3287598"},{"key":"e_1_2_1_81_1","doi-asserted-by":"publisher","DOI":"10.1111\/jcc4.12013"},{"key":"e_1_2_1_82_1","volume-title":"Proceedings of the ACM on Human-Computer Interaction 4, CSCW3","author":"Simpson Ellen","year":"2021","unstructured":"Ellen Simpson and Bryan Semaan. 2021. For you, or For \"you\"? Everyday LGBTQ+ encounters with TikTok. Proceedings of the ACM on Human-Computer Interaction 4, CSCW3 (2021), 1--34."},{"key":"e_1_2_1_83_1","volume-title":"Plans and situated actions: The problem of human-machine communication","author":"Suchman Lucy A","unstructured":"Lucy A Suchman. 1987. Plans and situated actions: The problem of human-machine communication. Cambridge University Press."},{"key":"e_1_2_1_84_1","doi-asserted-by":"publisher","DOI":"10.1145\/2460276.2460278"},{"key":"e_1_2_1_85_1","unstructured":"Lucas Theis and Zehan Wang. 2018. Speedy Neural Networks for Smart Auto-Cropping of Images. https:\/\/blog.twitter.com\/engineering\/en_us\/topics\/infrastructure\/2018\/Smart-Auto-Cropping-of-Images.html"},{"key":"e_1_2_1_86_1","volume-title":"Australian Uber drivers say the company is manipulating their ratings to boost its fees. Businessinsider (May","author":"Tucker Harry","year":"2016","unstructured":"Harry Tucker. 2016. Australian Uber drivers say the company is manipulating their ratings to boost its fees. Businessinsider (May 2016). https:\/\/www.businessinsider.com.au\/australian-uber-drivers-say-the-company-ismanipulating-their-ratings-to-boost-the-companys-fees-2016--5"},{"key":"e_1_2_1_87_1","volume-title":"Proceedings of the 18th International Conference on Autonomous Agents and MultiAgent Systems. 2238--2240","author":"Vandenhof Colin","year":"2019","unstructured":"Colin Vandenhof and Edith Law. 2019. Contradict the Machine: A Hybrid Approach to Identifying Unknown Unknowns. 
In Proceedings of the 18th International Conference on Autonomous Agents and MultiAgent Systems. 2238--2240."},{"key":"e_1_2_1_88_1","doi-asserted-by":"publisher","DOI":"10.1145\/3173574.3174014"},{"key":"e_1_2_1_89_1","volume-title":"Algorithmic resistance: Media practices and the politics of repair. Information","author":"Velkova Julia","year":"2019","unstructured":"Julia Velkova and Anne Kaun. 2019. Algorithmic resistance: Media practices and the politics of repair. Information, Communication & Society (2019), 1--18."},{"key":"e_1_2_1_90_1","volume-title":"Apple card investigated after gender discrimination complaints. The New York Times (Nov","author":"Vigdor Neil","year":"2019","unstructured":"Neil Vigdor. 2019. Apple card investigated after gender discrimination complaints. The New York Times (Nov. 2019). https:\/\/www.nytimes.com\/2019\/11\/10\/business\/Apple-credit-card-investigation.html"},{"key":"e_1_2_1_91_1","volume-title":"Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day. The Verge (Mar","author":"Vincent James","year":"2016","unstructured":"James Vincent. 2016. Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day. The Verge (Mar 2016). https:\/\/www.theverge.com\/2016\/3\/24\/11297050\/tay-microsoft-chatbot-racist"},{"key":"e_1_2_1_92_1","volume-title":"Google 'fixed' its racist algorithm by removing gorillas from its image-labeling tech. The Verge (Jan","author":"Vincent James","year":"2018","unstructured":"James Vincent. 2018. Google 'fixed' its racist algorithm by removing gorillas from its image-labeling tech. The Verge (Jan. 2018). 
https:\/\/www.theverge.com\/2018\/1\/12\/16882408\/google-racist-gorillas-photo-recognition-algorithm-ai"},{"key":"e_1_2_1_93_1","doi-asserted-by":"publisher","DOI":"10.1145\/3442188.3445885"},{"key":"e_1_2_1_94_1","doi-asserted-by":"publisher","DOI":"10.1080\/1369118X.2018.1490796"},{"key":"e_1_2_1_95_1","volume-title":"Sensemaking in organizations","author":"Weick Karl E","unstructured":"Karl E Weick. 1995. Sensemaking in organizations. Vol. 3. Sage."},{"key":"e_1_2_1_96_1","volume-title":"Google debuts AI in Google Translate that addresses gender bias. VentureBeat (Apr","author":"Wiggers Kyle","year":"2020","unstructured":"Kyle Wiggers. 2020. Google debuts AI in Google Translate that addresses gender bias. VentureBeat (Apr 2020). https:\/\/venturebeat.com\/2020\/04\/22\/google-debuts-ai-in-google-translate-that-addresses-gender-bias"},{"key":"e_1_2_1_97_1","doi-asserted-by":"publisher","DOI":"10.1145\/3313831.3376424"},{"key":"e_1_2_1_98_1","unstructured":"Yelp. 2010. Yelp's Review Filter Explained. 
https:\/\/blog.yelp.com\/2010\/03\/yelp-review-filter-explained"},{"key":"e_1_2_1_99_1","doi-asserted-by":"publisher","DOI":"10.1007\/s10676-019-09497-z"},{"key":"e_1_2_1_100_1","doi-asserted-by":"publisher","DOI":"10.1038\/d41586-018-05707-8"},{"key":"e_1_2_1_101_1","doi-asserted-by":"publisher","DOI":"10.1145\/3479569"}],"container-title":["Proceedings of the ACM on Human-Computer Interaction"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3479577","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3479577","content-type":"application\/pdf","content-version":"vor","intended-application":"syndication"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3479577","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,7,14]],"date-time":"2025-07-14T05:05:14Z","timestamp":1752469514000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3479577"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,10,13]]},"references-count":101,"journal-issue":{"issue":"CSCW2","published-print":{"date-parts":[[2021,10,13]]}},"alternative-id":["10.1145\/3479577"],"URL":"https:\/\/doi.org\/10.1145\/3479577","relation":{},"ISSN":["2573-0142"],"issn-type":[{"value":"2573-0142","type":"electronic"}],"subject":[],"published":{"date-parts":[[2021,10,13]]},"assertion":[{"value":"2021-10-18","order":3,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}