{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,14]],"date-time":"2026-02-14T10:22:41Z","timestamp":1771064561402,"version":"3.50.1"},"publisher-location":"New York, NY, USA","reference-count":30,"publisher":"ACM","license":[{"start":{"date-parts":[[2019,6,17]],"date-time":"2019-06-17T00:00:00Z","timestamp":1560729600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/www.acm.org\/publications\/policies\/copyright_policy#Background"}],"funder":[{"DOI":"10.13039\/100000001","name":"NSF","doi-asserted-by":"publisher","award":["CCF-1763311"],"award-info":[{"award-number":["CCF-1763311"]}],"id":[{"id":"10.13039\/100000001","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":[],"published-print":{"date-parts":[[2019,6,17]]},"DOI":"10.1145\/3328526.3329624","type":"proceedings-article","created":{"date-parts":[[2019,6,21]],"date-time":"2019-06-21T12:45:07Z","timestamp":1561121107000},"page":"809-824","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":5,"title":["Tracking and Improving Information in the Service of Fairness"],"prefix":"10.1145","author":[{"given":"Sumegha","family":"Garg","sequence":"first","affiliation":[{"name":"Princeton University, Princeton, NJ, USA"}]},{"given":"Michael P.","family":"Kim","sequence":"additional","affiliation":[{"name":"Stanford University, Stanford, CA, USA"}]},{"given":"Omer","family":"Reingold","sequence":"additional","affiliation":[{"name":"Stanford University, Stanford, CA, USA"}]}],"member":"320","published-online":{"date-parts":[[2019,6,17]]},"reference":[{"key":"e_1_3_2_2_1_1","volume-title":"Machine Bias: There's software used across the country to predict future criminals. And it's biased against blacks. ProPublica","author":"Angwin Julia","year":"2016","unstructured":"Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner. 2016. Machine Bias: There's software used across the country to predict future criminals. And it's biased against blacks. ProPublica (2016)."},{"key":"e_1_3_2_2_2_1","volume-title":"Equivalent comparisons of experiments. The annals of mathematical statistics","author":"Blackwell David","year":"1953","unstructured":"David Blackwell. 1953. Equivalent comparisons of experiments. The annals of mathematical statistics (1953)."},{"key":"e_1_3_2_2_3_1","unstructured":"Tolga Bolukbasi, Kai-Wei Chang, James Y Zou, Venkatesh Saligrama, and Adam T Kalai. 2016. Man is to computer programmer as woman is to homemaker? Debiasing word embeddings. In Neural Information Processing Systems."},{"key":"e_1_3_2_2_4_1","volume-title":"Verification of forecasts expressed in terms of probability. Monthly Weather Review","author":"Brier Glenn W","year":"1950","unstructured":"Glenn W Brier. 1950. Verification of forecasts expressed in terms of probability. Monthly Weather Review (1950)."},{"key":"e_1_3_2_2_5_1","unstructured":"Joy Buolamwini and Timnit Gebru. 2018. Gender shades: Intersectional accuracy disparities in commercial gender classification. In FAT$^*$."},{"key":"e_1_3_2_2_6_1","volume-title":"Why Is My Classifier Discriminatory? Neural Information Processing Systems","author":"Chen Irene","year":"2018","unstructured":"Irene Chen, Fredrik D Johansson, and David Sontag. 2018. Why Is My Classifier Discriminatory? Neural Information Processing Systems (2018)."},{"key":"e_1_3_2_2_7_1","volume-title":"Fair prediction with disparate impact: A study of bias in recidivism prediction instruments. Big Data","author":"Chouldechova Alexandra","year":"2017","unstructured":"Alexandra Chouldechova. 2017. Fair prediction with disparate impact: A study of bias in recidivism prediction instruments. Big Data (2017)."},{"key":"e_1_3_2_2_8_1","volume-title":"The measure and mismeasure of fairness: A critical review of fair machine learning. arXiv preprint","author":"Corbett-Davies Sam","year":"2018","unstructured":"Sam Corbett-Davies and Sharad Goel. 2018. The measure and mismeasure of fairness: A critical review of fair machine learning. arXiv preprint 1808.00023 (2018)."},{"key":"e_1_3_2_2_9_1","volume-title":"A simple proof of Blackwell's comparison of experiments theorem. Journal of Economic Theory","author":"Cr\u00e9mer Jacques","year":"1982","unstructured":"Jacques Cr\u00e9mer. 1982. A simple proof of Blackwell's comparison of experiments theorem. Journal of Economic Theory (1982)."},{"key":"e_1_3_2_2_11_1","doi-asserted-by":"publisher","DOI":"10.1145\/2090236.2090255"},{"key":"e_1_3_2_2_12_1","volume-title":"Facebook's Advertising Platform: New Attack Vectors and the Need for Interventions. arXiv preprint","author":"Faizullabhoy Irfan","year":"2018","unstructured":"Irfan Faizullabhoy and Aleksandra Korolova. 2018. Facebook's Advertising Platform: New Attack Vectors and the Need for Interventions. arXiv preprint 1803.10099 (2018)."},{"key":"e_1_3_2_2_13_1","volume-title":"Asymptotic Calibration. Biometrika","author":"Foster Dean P.","year":"1998","unstructured":"Dean P. Foster and Rakesh V. Vohra. 1998. Asymptotic Calibration. Biometrika (1998)."},{"key":"e_1_3_2_2_14_1","volume-title":"Tracking and Improving Information in the Service of Fairness. arXiv preprint arXiv:1904.09942","author":"Garg Sumegha","year":"2019","unstructured":"Sumegha Garg, Michael P Kim, and Omer Reingold. 2019. Tracking and Improving Information in the Service of Fairness. arXiv preprint arXiv:1904.09942 (2019)."},{"key":"e_1_3_2_2_15_1","volume-title":"Probabilistic forecasts, calibration and sharpness. Journal of the Royal Statistical Society: Series B (Statistical Methodology)","author":"Gneiting Tilmann","year":"2007","unstructured":"Tilmann Gneiting, Fadoua Balabdaoui, and Adrian E Raftery. 2007. Probabilistic forecasts, calibration and sharpness. Journal of the Royal Statistical Society: Series B (Statistical Methodology) (2007)."},{"key":"e_1_3_2_2_16_1","doi-asserted-by":"crossref","unstructured":"Tilmann Gneiting and Adrian E Raftery. 2007. Strictly proper scoring rules, prediction, and estimation. J. Amer. Statist. Assoc. (2007).","DOI":"10.1198\/016214506000001437"},{"key":"e_1_3_2_2_17_1","unstructured":"Moritz Hardt, Eric Price, and Nathan Srebro. 2016. Equality of opportunity in supervised learning. In Neural Information Processing Systems."},{"key":"e_1_3_2_2_18_1","volume-title":"Calibration for the (Computationally-Identifiable) Masses. ICML","author":"H\u00e9bert-Johnson \u00darsula","year":"2018","unstructured":"\u00darsula H\u00e9bert-Johnson, Michael P. Kim, Omer Reingold, and Guy N. Rothblum. 2018. Calibration for the (Computationally-Identifiable) Masses. ICML (2018)."},{"key":"e_1_3_2_2_19_1","doi-asserted-by":"crossref","unstructured":"Ben Hutchinson and Margaret Mitchell. 2019. 50 Years of Testing (Un)fairness: Lessons for Machine Learning. In FAT$^*$.","DOI":"10.1145\/3287560.3287600"},{"key":"e_1_3_2_2_20_1","volume-title":"Access to Population-Level Signaling as a Source of Inequality. FAT$^*$","author":"Immorlica Nicole","year":"2019","unstructured":"Nicole Immorlica, Katrina Ligett, and Juba Ziani. 2019. Access to Population-Level Signaling as a Source of Inequality. FAT$^*$ (2019)."},{"key":"e_1_3_2_2_21_1","unstructured":"Matthew Joseph, Michael Kearns, Jamie H. Morgenstern, and Aaron Roth. 2016. Fairness in learning: Classic and contextual bandits. In Neural Information Processing Systems."},{"key":"e_1_3_2_2_22_1","volume-title":"Downstream Effects of Affirmative Action. FAT$^*$","author":"Kannan Sampath","year":"2019","unstructured":"Sampath Kannan, Aaron Roth, and Juba Ziani. 2019. Downstream Effects of Affirmative Action. FAT$^*$ (2019)."},{"key":"e_1_3_2_2_23_1","volume-title":"Preventing Fairness Gerrymandering: Auditing and Learning for Subgroup Fairness. ICML","author":"Kearns Michael","year":"2018","unstructured":"Michael Kearns, Seth Neel, Aaron Roth, and Zhiwei Steven Wu. 2018. Preventing Fairness Gerrymandering: Auditing and Learning for Subgroup Fairness. ICML (2018)."},{"key":"e_1_3_2_2_24_1","volume-title":"Fairness Through Computationally-Bounded Awareness. Neural Information Processing Systems","author":"Kim Michael P.","year":"2018","unstructured":"Michael P. Kim, Omer Reingold, and Guy N. Rothblum. 2018. Fairness Through Computationally-Bounded Awareness. Neural Information Processing Systems (2018)."},{"key":"e_1_3_2_2_25_1","volume-title":"Algorithmic Fairness. In AEA Papers and Proceedings","author":"Kleinberg Jon","year":"2018","unstructured":"Jon Kleinberg, Jens Ludwig, Sendhil Mullainathan, and Ashesh Rambachan. 2018. Algorithmic Fairness. In AEA Papers and Proceedings."},{"key":"e_1_3_2_2_26_1","doi-asserted-by":"crossref","unstructured":"Jon M. Kleinberg, Sendhil Mullainathan, and Manish Raghavan. 2017. Inherent Trade-Offs in the Fair Determination of Risk Scores. In ITCS.","DOI":"10.1145\/3219617.3219634"},{"key":"e_1_3_2_2_27_1","unstructured":"Lydia T. Liu, Sarah Dean, Esther Rolf, Max Simchowitz, and Moritz Hardt. 2018. Delayed Impact of Fair Machine Learning. In ICML."},{"key":"e_1_3_2_2_28_1","volume-title":"Prediction-Based Decisions and Fairness: A Catalogue of Choices, Assumptions, and Definitions. arXiv preprint","author":"Mitchell Shira","year":"2018","unstructured":"Shira Mitchell, Eric Potash, and Solon Barocas. 2018. Prediction-Based Decisions and Fairness: A Catalogue of Choices, Assumptions, and Definitions. arXiv preprint 1811.07867 (2018)."},{"key":"e_1_3_2_2_29_1","volume-title":"On Fairness and Calibration. In Neural Information Processing Systems","author":"Pleiss Geoff","year":"2017","unstructured":"Geoff Pleiss, Manish Raghavan, Felix Wu, Jon M. Kleinberg, and Kilian Q. Weinberger. 2017. On Fairness and Calibration. In Neural Information Processing Systems."},{"key":"e_1_3_2_2_30_1","unstructured":"Samira Samadi, Uthaipon Tantipongpipat, Jamie H Morgenstern, Mohit Singh, and Santosh Vempala. 2018. The Price of Fair PCA: One Extra Dimension. In Neural Information Processing Systems."},{"key":"e_1_3_2_2_31_1","volume-title":"Fairness and Abstraction in Sociotechnical Systems. In FAT$^*$","author":"Selbst Andrew D.","year":"2019","unstructured":"Andrew D. Selbst, danah boyd, Sorelle A. Friedler, Suresh Venkatasubramanian, and Janet Vertesi. 2019. Fairness and Abstraction in Sociotechnical Systems. In FAT$^*$."}],"event":{"name":"EC '19: ACM Conference on Economics and Computation","location":"Phoenix AZ USA","acronym":"EC '19","sponsor":["SIGecom Special Interest Group on Economics and Computation"]},"container-title":["Proceedings of the 2019 ACM Conference on Economics and Computation"],"original-title":[],"link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3328526.3329624","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3328526.3329624","content-type":"application\/pdf","content-version":"vor","intended-application":"syndication"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3328526.3329624","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,17]],"date-time":"2025-06-17T23:53:41Z","timestamp":1750204421000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3328526.3329624"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2019,6,17]]},"references-count":30,"alternative-id":["10.1145\/3328526.3329624","10.1145\/3328526"],"URL":"https:\/\/doi.org\/10.1145\/3328526.3329624","relation":{},"subject":[],"published":{"date-parts":[[2019,6,17]]},"assertion":[{"value":"2019-06-17","order":2,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}