{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,20]],"date-time":"2026-02-20T23:29:55Z","timestamp":1771630195785,"version":"3.50.1"},"reference-count":76,"publisher":"Association for Computing Machinery (ACM)","issue":"1-4","license":[{"start":{"date-parts":[[2024,3,23]],"date-time":"2024-03-23T00:00:00Z","timestamp":1711152000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":["Trans. Soc. Comput."],"published-print":{"date-parts":[[2024,12,31]]},"abstract":"<jats:p>\n            Social media users create folk theories to help explain how elements of social media operate. Marginalized social media users face disproportionate content moderation and removal on social media platforms. We conducted a qualitative interview study (\n            <jats:italic>n<\/jats:italic>\n            = 24) to understand how marginalized social media users may create folk theories in response to content moderation and their perceptions of platforms\u2019 spirit, and how these theories may relate to their marginalized identities. We found that marginalized social media users develop folk theories informed by their perceptions of platforms\u2019 spirit to explain instances where their content was moderated in ways that violate their perceptions of how content moderation should work in practice. These folk theories typically address content being removed despite not violating community guidelines, along with bias against marginalized users embedded in guidelines. We provide implications for platforms, such as using marginalized users\u2019 folk theories as tools to identify elements of platform moderation systems that function incorrectly and disproportionately impact marginalized users.\n          <\/jats:p>","DOI":"10.1145\/3632741","type":"journal-article","created":{"date-parts":[[2023,12,2]],"date-time":"2023-12-02T12:06:32Z","timestamp":1701518792000},"page":"1-27","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":21,"title":["Content Moderation Folk Theories and Perceptions of Platform Spirit among Marginalized Social Media Users"],"prefix":"10.1145","volume":"7","author":[{"ORCID":"https:\/\/orcid.org\/0009-0008-8394-7991","authenticated-orcid":false,"given":"Samuel","family":"Mayworm","sequence":"first","affiliation":[{"name":"University of Michigan, Ann Arbor, USA"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-1001-2307","authenticated-orcid":false,"given":"Michael Ann","family":"DeVito","sequence":"additional","affiliation":[{"name":"Northeastern University, Boston, USA"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-1329-4242","authenticated-orcid":false,"given":"Daniel","family":"Delmonaco","sequence":"additional","affiliation":[{"name":"University of Michigan, Ann Arbor, USA"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-0520-0218","authenticated-orcid":false,"given":"Hibby","family":"Thach","sequence":"additional","affiliation":[{"name":"University of Michigan, Ann Arbor, USA"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-6552-4540","authenticated-orcid":false,"given":"Oliver L.","family":"Haimson","sequence":"additional","affiliation":[{"name":"University of Michigan, Ann Arbor, USA"}]}],"member":"320","published-online":{"date-parts":[[2024,3,23]]},"reference":[{"key":"e_1_3_3_2_2","unstructured":"Julia Alexander. 2019. 
LGBTQ YouTubers are Suing YouTube Over Alleged Discrimination. https:\/\/www.theverge.com\/2019\/8\/14\/20805283\/lgbtq-youtuber-lawsuit-discrimination-alleged-video-recommendations-demonetization"},{"key":"e_1_3_3_3_2","unstructured":"Julia Angwin ProPublica and Hannes Grassegger. 2017. Facebook\u2019s Secret Censorship Rules Protect White Men From Hate Speech But Not Black Children. https:\/\/www.propublica.org\/article\/facebook-hate-speech-censorship-internal-documents-algorithms"},{"key":"e_1_3_3_4_2","volume-title":"Posting Into the Void","author":"Blunt Danielle","year":"2020","unstructured":"Danielle Blunt, Emily Coombes, Shanelle Mullin, and Ariel Wolf. 2020. Posting Into the Void. Technical Report. Hacking\/\/Hustling."},{"key":"e_1_3_3_5_2","unstructured":"Elena Botella. 2019. TikTok Admits It Suppressed Videos by Disabled Queer and Fat Creators. https:\/\/slate.com\/technology\/2019\/12\/tiktok-disabled-users-videos-suppressed.html"},{"key":"e_1_3_3_6_2","doi-asserted-by":"publisher","DOI":"10.1191\/1478088706qp063oa"},{"key":"e_1_3_3_7_2","volume-title":"Making Content Moderation Less Frustrating: How Do Users Experience Explanatory Human and AI Moderation Messages","author":"Calleberg Erik","unstructured":"Erik Calleberg. [n. d.]. Making Content Moderation Less Frustrating: How Do Users Experience Explanatory Human and AI Moderation Messages. Ph. D. Dissertation. S\u00f6dert\u00f6rn University, School of Natural Sciences, Technology and Environmental Studies, Media Technology.http:\/\/sh.diva-portal.org\/smash\/record.jsf?pid=diva2%3A1576614&dswid=5032"},{"key":"e_1_3_3_8_2","doi-asserted-by":"publisher","DOI":"10.3389\/fhumd.2021.626409"},{"key":"e_1_3_3_9_2","doi-asserted-by":"publisher","DOI":"10.4135\/9781452230153"},{"key":"e_1_3_3_10_2","article-title":"Transgender users accuse TikTok of censorship","author":"Criddle Cristina","year":"2020","unstructured":"Cristina Criddle. 2020. Transgender users accuse TikTok of censorship. BBC News (Feb.2020). https:\/\/www.bbc.com\/news\/technology-51474114","journal-title":"BBC News"},{"key":"e_1_3_3_11_2","doi-asserted-by":"publisher","DOI":"10.1145\/3337722.3337754"},{"key":"e_1_3_3_12_2","doi-asserted-by":"publisher","DOI":"10.1145\/3476080"},{"key":"e_1_3_3_13_2","doi-asserted-by":"crossref","unstructured":"Michael Ann DeVito. 2022. How transfeminine TikTok creators navigate the algorithmic trap of visibility via folk theorization. (2022).","DOI":"10.1145\/3555105"},{"key":"e_1_3_3_14_2","doi-asserted-by":"publisher","DOI":"10.1145\/3173574.3173694"},{"key":"e_1_3_3_15_2","doi-asserted-by":"publisher","DOI":"10.1145\/3170427.3186320"},{"key":"e_1_3_3_16_2","doi-asserted-by":"publisher","DOI":"10.1007\/s12119-020-09790-w"},{"key":"e_1_3_3_17_2","first-page":"23","volume-title":"The State of Content Moderation for the LGBTIQA+ Community and the Role of the EU Digital Services Act","author":"Dinar Christina","year":"2021","unstructured":"Christina Dinar. 2021. The State of Content Moderation for the LGBTIQA+ Community and the Role of the EU Digital Services Act. Technical Report. Heinrich-B\u00f6ll-Stiftung. 23 pages."},{"key":"e_1_3_3_18_2","doi-asserted-by":"publisher","DOI":"10.1080\/01972243.2021.1949768"},{"key":"e_1_3_3_19_2","unstructured":"Nick Drewe. 2016. The Hilarious List Of Hashtags Instagram Won\u2019t Let You Search. http:\/\/thedatapack.com\/banned-instagram-hashtags-update\/"},{"key":"e_1_3_3_20_2","unstructured":"Emily Dreyfuss. 2018. Twitter is Indeed Toxic for Women Amnesty Report Says. 
https:\/\/www.wired.com\/story\/amnesty-report-twitter-abuse-women\/"},{"key":"e_1_3_3_21_2","doi-asserted-by":"publisher","DOI":"10.1177\/01634437221111923"},{"key":"e_1_3_3_22_2","doi-asserted-by":"publisher","unstructured":"Motahhare Eslami Karrie Karahalios Christian Sandvig Kristen Vaccaro Aimee Rickman Kevin Hamilton and Alex Kirlik. 2016. First I \u201clike\u201d it then I hide it: Folk theories of social feeds. In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems. ACM San Jose California USA 2371\u20132382. 10.1145\/2858036.2858494","DOI":"10.1145\/2858036.2858494"},{"key":"e_1_3_3_23_2","unstructured":"Facebook. 2021. Adult Nudity and Sexual Activity. https:\/\/transparency.fb.com\/policies\/community-standards\/adult-nudity-sexual-activity\/?from=https%3A%2F%2Fm.facebook.com%2Fcommunitystandards%2Fadult_nudity_sexual_activity%2F%3Fprivacy_mutation_token%3DeyJ0eXBlIjowLCJjcmVhdGlvbl90aW1lIjoxNjM3MTU2Mjc3LCJjYWxsc2l0ZV9pZCI6MTUwODA5MTU3OTM3NDQ1OX0%253D%26__m_async_page__%26__big_pipe_on__%26fb_dtsg_ag%3DAQwCeH0YZhbwj16xn88Fks9UHTnrTkr9xlge52JYlpiJuYGu%253A34%253A1624034963%26jazoest%3D25008"},{"key":"e_1_3_3_24_2","unstructured":"Facebook. 2022. Hate Speech. https:\/\/transparency.fb.com\/policies\/community-standards\/hate-speech\/"},{"key":"e_1_3_3_25_2","doi-asserted-by":"publisher","DOI":"10.1145\/3392845"},{"key":"e_1_3_3_26_2","unstructured":"Electronic Frontier Foundation. 2019. EFF Project Shows How People Are Unfairly \u201cTOSsed Out\u201d By Platforms\u2019 Absurd Enforcement of Content Rules. https:\/\/www.eff.org\/press\/releases\/eff-project-shows-how-people-are-unfairly-tossed-out-platforms-absurd-enforcement"},{"key":"e_1_3_3_27_2","doi-asserted-by":"publisher","DOI":"10.2139\/ssrn.2910571"},{"key":"e_1_3_3_28_2","volume-title":"Social Media: Misinformation and Content Moderation Issues for Congress","author":"Gallo Jason","year":"2021","unstructured":"Jason Gallo and Clare Cho. 2021. Social Media: Misinformation and Content Moderation Issues for Congress. Technical Report R46662. Congressional Research Service. https:\/\/crsreports.congress.gov\/product\/pdf\/R\/R46662"},{"key":"e_1_3_3_29_2","doi-asserted-by":"publisher","DOI":"10.1146\/annurev-anthro-081309-145822"},{"key":"e_1_3_3_30_2","doi-asserted-by":"publisher","DOI":"10.1016\/B0-08-043076-7\/01487-X"},{"key":"e_1_3_3_31_2","doi-asserted-by":"publisher","DOI":"10.1080\/14680777.2020.1783807"},{"key":"e_1_3_3_32_2","unstructured":"Shirin Ghaffary. 2021. How TikTok\u2019s Hate Speech Detection Tool set off a Debate About Racial Bias on the App. https:\/\/www.vox.com\/recode\/2021\/7\/7\/22566017\/tiktok-black-creators-ziggi-tyler-debate-about-black-lives-matter-racial-bias-social-media"},{"key":"e_1_3_3_33_2","volume-title":"Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (illustrated edition ed.)","author":"Gillespie Tarleton","year":"2018","unstructured":"Tarleton Gillespie. 2018. Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media (illustrated edition ed.). Yale University Press, New Haven."},{"key":"e_1_3_3_34_2","doi-asserted-by":"publisher","DOI":"10.1177\/2053951720943234"},{"key":"e_1_3_3_35_2","unstructured":"GLAAD. 2021. GLAAD\u2019s Social Media Safety Index. 
https:\/\/www.glaad.org\/blog\/glaads-social-media-safety-index"},{"key":"e_1_3_3_36_2","doi-asserted-by":"publisher","DOI":"10.1177\/2053951719897945"},{"key":"e_1_3_3_37_2","first-page":"42","article-title":"The virtues of moderation","volume":"17","author":"Grimmelmann James","year":"2015","unstructured":"James Grimmelmann. 2015. The virtues of moderation. Yale Journal of Law and Technology 17 (2015), 42\u2013109. https:\/\/heinonline.org\/HOL\/P?h=hein.journals\/yjolt17&i=42","journal-title":"Yale Journal of Law and Technology"},{"key":"e_1_3_3_38_2","doi-asserted-by":"publisher","DOI":"10.1145\/3479610"},{"key":"e_1_3_3_39_2","article-title":"Deceiving Google\u2019s perspective API built for detecting toxic comments","author":"Hosseini Hossein","year":"2017","unstructured":"Hossein Hosseini, Sreeram Kannan, Baosen Zhang, and Radha Poovendran. 2017. Deceiving Google\u2019s perspective API built for detecting toxic comments. arXiv:1702.08138 [cs] (Feb.2017). http:\/\/arxiv.org\/abs\/1702.08138","journal-title":"arXiv:1702.08138 [cs]"},{"key":"e_1_3_3_40_2","unstructured":"Alice Hunsberger Vanity Brown and Lily Galib. 2021. Best Practices for Gender-Inclusive Content Moderation. https:\/\/static1.squarespace.com\/static\/5f7cf4654534a21e8041006b\/t\/61773f8407378d02feeb6f0e\/1635205002056\/CX-White+Paper-2021.pdf"},{"key":"e_1_3_3_41_2","doi-asserted-by":"publisher","unstructured":"Olu Jenzen and Irmi Karl. 2014. Make share care: Social media and LGBTQ youth engagement. (2014). 10.7264\/N39P2ZX3","DOI":"10.7264\/N39P2ZX3"},{"key":"e_1_3_3_42_2","doi-asserted-by":"publisher","DOI":"10.1080\/0966369X.2017.1396204"},{"key":"e_1_3_3_43_2","doi-asserted-by":"publisher","DOI":"10.1145\/3359294"},{"key":"e_1_3_3_44_2","doi-asserted-by":"publisher","DOI":"10.1145\/3406865.3418312"},{"key":"e_1_3_3_45_2","doi-asserted-by":"publisher","DOI":"10.1609\/icwsm.v13i01.3229"},{"key":"e_1_3_3_46_2","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v34i09.7117"},{"key":"e_1_3_3_47_2","doi-asserted-by":"publisher","DOI":"10.1145\/3476046"},{"key":"e_1_3_3_48_2","unstructured":"Knight Foundation. 2020. Americans Support Free Speech Online but Want More Action to Curb Harmful Content. https:\/\/knightfoundation.org\/press\/releases\/americans-support-free-speech-online-but-want-more-action-to-curb-harmful-content\/"},{"key":"e_1_3_3_49_2","unstructured":"Sam Levin. 2016. Facebook Temporarily Blocks Black Lives Matter Activist After he Posts Racist Email. https:\/\/www.theguardian.com\/technology\/2016\/sep\/12\/facebook-blocks-shaun-king-black-lives-matter"},{"key":"e_1_3_3_50_2","doi-asserted-by":"publisher","DOI":"10.1080\/2005615X.2017.1313482"},{"key":"e_1_3_3_51_2","article-title":"Coronavirus and the frailness of platform governance","volume":"9","author":"Magalh\u00e3es Jo\u00e3o Carlos","year":"2020","unstructured":"Jo\u00e3o Carlos Magalh\u00e3es and Christian Katzenbach. 2020. Coronavirus and the frailness of platform governance. Internet Policy Review 9 (March2020). https:\/\/nbn-resolving.org\/urn:nbn:de:0168-ssoar-68143-2","journal-title":"Internet Policy Review"},{"key":"e_1_3_3_52_2","doi-asserted-by":"publisher","DOI":"10.1177\/0959353517717751"},{"key":"e_1_3_3_53_2","first-page":"17","volume-title":"Algorithmic Misogynoir in Content Moderation Practice","author":"Marshall Brandeis","year":"2021","unstructured":"Brandeis Marshall. 2021. Algorithmic Misogynoir in Content Moderation Practice. Technical Report. Heinrich-B\u00f6ll-Stiftung. 
17 pages."},{"key":"e_1_3_3_54_2","doi-asserted-by":"publisher","DOI":"10.1073\/pnas.1813486116"},{"key":"e_1_3_3_55_2","doi-asserted-by":"publisher","DOI":"10.1177\/146144804047513"},{"key":"e_1_3_3_56_2","doi-asserted-by":"publisher","DOI":"10.1353\/csd.2017.0040"},{"key":"e_1_3_3_57_2","unstructured":"Adam Mosseri. 2020. An Update on Our Equity Work. https:\/\/about.instagram.com\/blog\/announcements\/updates-on-our-equity-work"},{"key":"e_1_3_3_58_2","doi-asserted-by":"publisher","DOI":"10.1177\/1461444818773059"},{"key":"e_1_3_3_59_2","unstructured":"Jibon Naher An Taehyeon and Kim Juho. 2019. Improving users\u2019 algorithmic Understandability and Trust in Content Moderation. Association for Computing Machinery. https:\/\/kixlab.github.io\/website-files\/2019\/cscw2019-workshop-ContestabilityDesign-paper.pdf"},{"key":"e_1_3_3_60_2","doi-asserted-by":"publisher","DOI":"10.1145\/3572334.3572396"},{"key":"e_1_3_3_61_2","unstructured":"Oversight Board. 2022. Appeal to Shape the Future of Facebook and Instagram. https:\/\/www.oversightboard.com\/appeals-process\/"},{"key":"e_1_3_3_62_2","unstructured":"Oversight Board. 2023. Oversight Board overturns Meta\u2019s original decisions in the \u201dGender identity and nudity\u201d cases. https:\/\/www.oversightboard.com\/news\/1214820616135890-oversight-board-overturns-meta-s-original-decisions-in-the-gender-identity-and-nudity-cases\/"},{"key":"e_1_3_3_63_2","doi-asserted-by":"publisher","DOI":"10.5210\/fm.v23i3.8283"},{"key":"e_1_3_3_64_2","unstructured":"Adi Robertson. 2019. TikTok Prevented Disabled Users\u2019 Videos from Showing up in Feeds. https:\/\/www.theverge.com\/2019\/12\/2\/20991843\/tiktok-bytedance-platform-disabled-autism-lgbt-fat-user-algorithm-reach-limit"},{"key":"e_1_3_3_65_2","doi-asserted-by":"publisher","DOI":"10.2307\/j.ctvc77mp9.30"},{"key":"e_1_3_3_66_2","doi-asserted-by":"publisher","DOI":"10.1145\/2998181.2998277"},{"key":"e_1_3_3_67_2","article-title":"Censorship of marginalized communities on Instagram, 2021","author":"Smith Shakira","year":"2021","unstructured":"Shakira Smith, Oliver L. Haimson, Claire Fitzsimmons, and Nikki Echarte Brown. 2021. Censorship of marginalized communities on Instagram, 2021. Salty (Sept.2021). https:\/\/saltyworld.net\/product\/exclusive-report-censorship-of-marginalized-communities-on-instagram-2021-pdf-download\/","journal-title":"Salty"},{"key":"e_1_3_3_68_2","doi-asserted-by":"publisher","DOI":"10.1177\/1461444815625941"},{"key":"e_1_3_3_69_2","doi-asserted-by":"publisher","DOI":"10.1017\/9781108666428"},{"key":"e_1_3_3_70_2","first-page":"1526","article-title":"What do we mean when we talk about transparency? Toward meaningful transparency in commercial content moderation","volume":"13","author":"Suzor Nicolas P.","year":"2019","unstructured":"Nicolas P. Suzor. 2019. What do we mean when we talk about transparency? Toward meaningful transparency in commercial content moderation. International Journal of Communication 13 (2019), 1526\u20131543.","journal-title":"International Journal of Communication"},{"key":"e_1_3_3_71_2","unstructured":"TikTok. 2022. Community Guidelines. https:\/\/www.tiktok.com\/community-guidelines?lang=en"},{"key":"e_1_3_3_72_2","unstructured":"Twitter. 2022. Hateful Conduct Policy. https:\/\/help.twitter.com\/en\/rules-and-policies\/hateful-conduct-policy"},{"key":"e_1_3_3_73_2","doi-asserted-by":"publisher","DOI":"10.1145\/3415238"},{"key":"e_1_3_3_74_2","unstructured":"Jillian York and Karen Gullo. 2018. 
Offline\/Online Project Highlights How the Oppression Marginalized Communities Face in the Real World Follows Them Online. https:\/\/www.eff.org\/deeplinks\/2018\/03\/offlineonline-project-highlights-how-oppression-marginalized-communities-face-real"},{"key":"e_1_3_3_75_2","doi-asserted-by":"publisher","DOI":"10.1080\/15205436.2021.1970186"},{"key":"e_1_3_3_76_2","doi-asserted-by":"publisher","DOI":"10.1177\/0163443720972314"},{"key":"e_1_3_3_77_2","doi-asserted-by":"publisher","DOI":"10.1177\/1461444820942483"}],"container-title":["ACM Transactions on Social Computing"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3632741","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3632741","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,18]],"date-time":"2025-06-18T22:51:04Z","timestamp":1750287064000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3632741"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,3,23]]},"references-count":76,"journal-issue":{"issue":"1-4","published-print":{"date-parts":[[2024,12,31]]}},"alternative-id":["10.1145\/3632741"],"URL":"https:\/\/doi.org\/10.1145\/3632741","relation":{},"ISSN":["2469-7818","2469-7826"],"issn-type":[{"value":"2469-7818","type":"print"},{"value":"2469-7826","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,3,23]]},"assertion":[{"value":"2023-04-12","order":0,"name":"received","label":"Received","group":{"name":"publication_history","label":"Publication History"}},{"value":"2023-10-30","order":2,"name":"accepted","label":"Accepted","group":{"name":"publication_history","label":"Publication History"}},{"value":"2024-03-23","order":3,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}