{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,27]],"date-time":"2026-03-27T19:53:47Z","timestamp":1774641227132,"version":"3.50.1"},"publisher-location":"New York, NY, USA","reference-count":39,"publisher":"ACM","license":[{"start":{"date-parts":[[2021,7,21]],"date-time":"2021-07-21T00:00:00Z","timestamp":1626825600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/www.acm.org\/publications\/policies\/copyright_policy#Background"}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":[],"published-print":{"date-parts":[[2021,7,21]]},"DOI":"10.1145\/3461702.3462528","type":"proceedings-article","created":{"date-parts":[[2021,7,31]],"date-time":"2021-07-31T01:21:38Z","timestamp":1627694498000},"page":"146-153","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":11,"title":["Fairness and Data Protection Impact Assessments"],"prefix":"10.1145","author":[{"given":"Atoosa","family":"Kasirzadeh","sequence":"first","affiliation":[{"name":"Australian National University &amp; University of Toronto, Canberra, ACT, Australia"}]},{"given":"Damian","family":"Clifford","sequence":"additional","affiliation":[{"name":"Australian National University, Canberra, ACT, Australia"}]}],"member":"320","published-online":{"date-parts":[[2021,7,30]]},"reference":[{"key":"e_1_3_2_1_1_1","volume-title":"Guidelines on Data Protection Impact Assessment (DPIA) and Determining Whether Processing Is--Likely to Result in a High Risk?-\u00f9 for the Purposes of Regulation 2016\/679 (No WP248 rev.01","author":"WP.","year":"2017","unstructured":"A29 WP. 2017. Article 29 Working Party , Guidelines on Data Protection Impact Assessment (DPIA) and Determining Whether Processing Is--Likely to Result in a High Risk?-\u00f9 for the Purposes of Regulation 2016\/679 (No WP248 rev.01 , 4 October 2017 ) 1. (2017). A29WP. 2017. 
Article 29 Working Party, Guidelines on Data Protection Impact Assessment (DPIA) and Determining Whether Processing Is 'Likely to Result in a High Risk' for the Purposes of Regulation 2016\/679 (No WP248 rev.01, 4 October 2017) 1. (2017)."},{"key":"e_1_3_2_1_2_1","volume-title":"Machine Bias: There's Software Used Across the Country to Predict Future Criminals and It's Biased Against Blacks. ProPublica","author":"Angwin Julia","year":"2016","unstructured":"Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner. 2016. Machine Bias: There's Software Used Across the Country to Predict Future Criminals and It's Biased Against Blacks. ProPublica (2016). https:\/\/www.propublica.org\/article\/machine-bias-risk-assessments-in-criminal-sentencing. Julia Angwin, Jeff Larson, Surya Mattu, and Lauren Kirchner. 2016. Machine Bias: There's Software Used Across the Country to Predict Future Criminals and It's Biased Against Blacks. ProPublica (2016). https:\/\/www.propublica.org\/article\/machine-bias-risk-assessments-in-criminal-sentencing."},{"key":"e_1_3_2_1_3_1","doi-asserted-by":"publisher","DOI":"10.1177\/0049124118782533"},{"key":"e_1_3_2_1_4_1","volume-title":"Conference on Fairness, Accountability and Transparency. 77--91","author":"Buolamwini Joy","year":"2018","unstructured":"Joy Buolamwini and Timnit Gebru . 2018 . Gender shades: Intersectional accuracy disparities in commercial gender classification . In Conference on Fairness, Accountability and Transparency. 77--91 . Joy Buolamwini and Timnit Gebru. 2018. Gender shades: Intersectional accuracy disparities in commercial gender classification. In Conference on Fairness, Accountability and Transparency. 77--91."},{"key":"e_1_3_2_1_5_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.clsr.2018.01.004"},{"key":"e_1_3_2_1_6_1","series-title":"Information Law Series","volume-title":"Data protection law: approaching its rationale, logic and limits (Vol. 
10)","author":"Bygrave LA","year":"2002","unstructured":"LA Bygrave . 2002. Data protection law: approaching its rationale, logic and limits,(Vol. 10) . Information Law Series . The Hague : Kluwer Law International ( 2002 ). LA Bygrave. 2002. Data protection law: approaching its rationale, logic and limits,(Vol. 10). Information Law Series. The Hague: Kluwer Law International (2002)."},{"key":"e_1_3_2_1_7_1","volume-title":"Fair prediction with disparate impact: A study of bias in recidivism prediction instruments. Big data","author":"Chouldechova Alexandra","year":"2017","unstructured":"Alexandra Chouldechova . 2017. Fair prediction with disparate impact: A study of bias in recidivism prediction instruments. Big data , Vol. 5 , 2 ( 2017 ), 153--163. Alexandra Chouldechova. 2017. Fair prediction with disparate impact: A study of bias in recidivism prediction instruments. Big data, Vol. 5, 2 (2017), 153--163."},{"key":"e_1_3_2_1_8_1","volume-title":"Courts and predictive algorithms. Data & Civil Right: Criminal Justice and Civil Rights Primer","author":"Christin Ang\u00e8le","year":"2015","unstructured":"Ang\u00e8le Christin , Alex Rosenblat , and Danah Boyd . 2015. Courts and predictive algorithms. Data & Civil Right: Criminal Justice and Civil Rights Primer ( 2015 ). Ang\u00e8le Christin, Alex Rosenblat, and Danah Boyd. 2015. Courts and predictive algorithms. Data & Civil Right: Criminal Justice and Civil Rights Primer (2015)."},{"key":"e_1_3_2_1_9_1","doi-asserted-by":"publisher","DOI":"10.1093\/yel\/yey004"},{"key":"e_1_3_2_1_10_1","doi-asserted-by":"publisher","DOI":"10.1017\/glj.2019.56"},{"key":"e_1_3_2_1_11_1","volume-title":"Privacy Impact Assessment Application to IOT Devices.","author":"CNIL.","year":"2018","unstructured":"CNIL. 2018. Commission Nationale Informatique & Libertes , Privacy Impact Assessment Application to IOT Devices. ( 2018 ). CNIL. 2018. Commission Nationale Informatique & Libertes, Privacy Impact Assessment Application to IOT Devices. 
(2018)."},{"key":"e_1_3_2_1_12_1","doi-asserted-by":"publisher","DOI":"10.1145\/3097983.3098095"},{"key":"e_1_3_2_1_13_1","volume-title":"Data Protection Commission","author":"DPC.","year":"2019","unstructured":"DPC. October 2019. Data Protection Commission , Guidance Note : Guide to Data Protection Impact Assessments (DPIAs) . ( October 2019 ). DPC. October 2019. Data Protection Commission, Guidance Note: Guide to Data Protection Impact Assessments (DPIAs). ( October 2019)."},{"key":"e_1_3_2_1_14_1","doi-asserted-by":"publisher","DOI":"10.1145\/2090236.2090255"},{"key":"e_1_3_2_1_15_1","volume-title":"Guidelines 4\/2019 on Article 25 Data Protection by Design and by Default (Adopted on","author":"EDPB.","year":"2020","unstructured":"EDPB. 2020. European Data Protection Board , Guidelines 4\/2019 on Article 25 Data Protection by Design and by Default (Adopted on 20 October 2020 ) 1. (2020). EDPB. 2020. European Data Protection Board, Guidelines 4\/2019 on Article 25 Data Protection by Design and by Default (Adopted on 20 October 2020) 1. (2020)."},{"key":"e_1_3_2_1_16_1","unstructured":"EDPS. 2014. European Data Protection Supervisor Privacy and Competitiveness in the Age of Big Data: The Interplay Between Data Protection Competition Law and Consumer Protection in the Digital Economy .European Data Protection Supervisor.  EDPS. 2014. European Data Protection Supervisor Privacy and Competitiveness in the Age of Big Data: The Interplay Between Data Protection Competition Law and Consumer Protection in the Digital Economy .European Data Protection Supervisor."},{"key":"e_1_3_2_1_17_1","volume-title":"Survey on Data Protection Impact Assessments under Article 39 of the Regulation (case 2020-0066","author":"EDPS.","year":"2020","unstructured":"EDPS. 2020. European Data Protection Supervisor , Survey on Data Protection Impact Assessments under Article 39 of the Regulation (case 2020-0066 , 6 July 2020 1). (2020). EDPS. 2020. 
European Data Protection Supervisor, Survey on Data Protection Impact Assessments under Article 39 of the Regulation (case 2020-0066, 6 July 2020 1). (2020)."},{"key":"e_1_3_2_1_18_1","doi-asserted-by":"publisher","DOI":"10.1145\/3287560.3287589"},{"key":"e_1_3_2_1_19_1","doi-asserted-by":"crossref","unstructured":"Philipp Hacker. 2018. Teaching fairness to artificial intelligence: Existing and novel strategies against algorithmic discrimination under EU law. (2018).  Philipp Hacker. 2018. Teaching fairness to artificial intelligence: Existing and novel strategies against algorithmic discrimination under EU law. (2018).","DOI":"10.54648\/COLA2018095"},{"key":"e_1_3_2_1_20_1","unstructured":"Moritz Hardt Eric Price and Nati Srebro. 2016. Equality of opportunity in supervised learning. In Advances in neural information processing systems. 3315--3323.  Moritz Hardt Eric Price and Nati Srebro. 2016. Equality of opportunity in supervised learning. In Advances in neural information processing systems. 3315--3323."},{"key":"e_1_3_2_1_21_1","article-title":"The perfect match? A closer look at the relationship between EU consumer law and data protection law","volume":"54","author":"Helberger Natali","year":"2017","unstructured":"Natali Helberger , Frederik Zuiderveen Borgesius , and Agustin Reyna . 2017 . The perfect match? A closer look at the relationship between EU consumer law and data protection law . Common Market Law Review , Vol. 54 , 5 (2017). Natali Helberger, Frederik Zuiderveen Borgesius, and Agustin Reyna. 2017. The perfect match? A closer look at the relationship between EU consumer law and data protection law. Common Market Law Review, Vol. 54, 5 (2017).","journal-title":"Common Market Law Review"},{"key":"e_1_3_2_1_22_1","volume-title":"Sample DPIA Template (No20180209v0.3).","author":"ICO.","year":"2018","unstructured":"ICO. 2018. Information Commissioner's Office , Sample DPIA Template (No20180209v0.3). ( 2018 ). ICO. 2018. 
Information Commissioner's Office, Sample DPIA Template (No20180209v0.3). (2018)."},{"key":"e_1_3_2_1_23_1","volume-title":"Guidance on AI and data protection (20203006 0.0.39). ICO","author":"ICO.","year":"2020","unstructured":"ICO. 2020. Information Commissioner's Office , Guidance on AI and data protection (20203006 0.0.39). ICO ( 2020 ). ICO. 2020. Information Commissioner's Office, Guidance on AI and data protection (20203006 0.0.39). ICO (2020)."},{"key":"e_1_3_2_1_24_1","first-page":"1","article-title":"Information Commissioner's Office","volume":"2021101","author":"ICO.","year":"2021","unstructured":"ICO. 2021 . Information Commissioner's Office , Guidance on Data Protection Impact Assessments (DPIAs) (No 2021101 1 .1.64). (2021). ICO. 2021. Information Commissioner's Office, Guidance on Data Protection Impact Assessments (DPIAs) (No 2021101 1.1.64). (2021).","journal-title":"Guidance on Data Protection Impact Assessments (DPIAs) (No"},{"key":"e_1_3_2_1_25_1","doi-asserted-by":"publisher","DOI":"10.1093\/idpl\/ipz028"},{"key":"e_1_3_2_1_26_1","doi-asserted-by":"crossref","unstructured":"Atoosa Kasirzadeh and Andrew Smart. 2021. A critique of the use of counterfactuals in fair machine learning. In Fairness accountability and Transparency (ACM).  Atoosa Kasirzadeh and Andrew Smart. 2021. A critique of the use of counterfactuals in fair machine learning. In Fairness accountability and Transparency (ACM).","DOI":"10.1145\/3442188.3445886"},{"key":"e_1_3_2_1_27_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.jbankfin.2010.06.001"},{"key":"e_1_3_2_1_28_1","doi-asserted-by":"publisher","DOI":"10.1145\/3219617.3219634"},{"key":"e_1_3_2_1_29_1","volume-title":"On decision transparency, or how to enhance data protection after the computational turn. 
Privacy, due process and the computational turn: the philosophy of law meets the philosophy of technology","author":"Koops Bert-Jaap","year":"2013","unstructured":"Bert-Jaap Koops . 2013. On decision transparency, or how to enhance data protection after the computational turn. Privacy, due process and the computational turn: the philosophy of law meets the philosophy of technology ( 2013 ), 189--213. Bert-Jaap Koops. 2013. On decision transparency, or how to enhance data protection after the computational turn. Privacy, due process and the computational turn: the philosophy of law meets the philosophy of technology (2013), 189--213."},{"key":"e_1_3_2_1_30_1","doi-asserted-by":"crossref","unstructured":"Eleni Kosta. 2020. Article 35. Data Protection Impact Assessment. In The EU General Data Protection Regulation (GDPR): A Commentary (Christopher Kuner et al.).  Eleni Kosta. 2020. Article 35. Data Protection Impact Assessment. In The EU General Data Protection Regulation (GDPR): A Commentary (Christopher Kuner et al.).","DOI":"10.1093\/oso\/9780198826491.003.0072"},{"key":"e_1_3_2_1_31_1","unstructured":"Matt J Kusner Joshua Loftus Chris Russell and Ricardo Silva. 2017. Counterfactual fairness. In Advances in Neural Information Processing Systems. 4066--4076.  Matt J Kusner Joshua Loftus Chris Russell and Ricardo Silva. 2017. Counterfactual fairness. In Advances in Neural Information Processing Systems. 4066--4076."},{"key":"e_1_3_2_1_32_1","doi-asserted-by":"publisher","DOI":"10.1017\/S0020589314000244"},{"key":"e_1_3_2_1_33_1","doi-asserted-by":"publisher","DOI":"10.1145\/3351095.3372868"},{"key":"e_1_3_2_1_34_1","volume-title":"A survey on bias and fairness in machine learning. arXiv preprint arXiv:1908.09635","author":"Mehrabi Ninareh","year":"2019","unstructured":"Ninareh Mehrabi , Fred Morstatter , Nripsuta Saxena , Kristina Lerman , and Aram Galstyan . 2019. A survey on bias and fairness in machine learning. arXiv preprint arXiv:1908.09635 ( 2019 ). 
Ninareh Mehrabi, Fred Morstatter, Nripsuta Saxena, Kristina Lerman, and Aram Galstyan. 2019. A survey on bias and fairness in machine learning. arXiv preprint arXiv:1908.09635 (2019)."},{"key":"e_1_3_2_1_35_1","doi-asserted-by":"publisher","DOI":"10.1098\/rsta.2018.0089"},{"key":"e_1_3_2_1_36_1","volume-title":"Science","volume":"366","author":"Obermeyer Ziad","year":"2019","unstructured":"Ziad Obermeyer , Brian Powers , Christine Vogeli , and Sendhil Mullainathan . 2019 . Dissecting racial bias in an algorithm used to manage the health of populations . Science , Vol. 366 , 6464 (2019), 447--453. Ziad Obermeyer, Brian Powers, Christine Vogeli, and Sendhil Mullainathan. 2019. Dissecting racial bias in an algorithm used to manage the health of populations. Science, Vol. 366, 6464 (2019), 447--453."},{"key":"e_1_3_2_1_37_1","volume-title":"Artificial Intelligence, Machine Learning and Data Protection (No 20170904). Version: 2.2","author":"Information Commission","year":"2017","unstructured":"Information Commissioner's Office. 2017. Big Data, Artificial Intelligence, Machine Learning and Data Protection (No 20170904). Version: 2.2 ( 2017 ). Information Commissioner's Office. 2017. Big Data, Artificial Intelligence, Machine Learning and Data Protection (No 20170904). Version: 2.2 (2017)."},{"key":"e_1_3_2_1_38_1","unstructured":"Geoff Pleiss Manish Raghavan Felix Wu Jon Kleinberg and Kilian Q Weinberger. 2017. On fairness and calibration. In Advances in Neural Information Processing Systems. 5680--5689.  Geoff Pleiss Manish Raghavan Felix Wu Jon Kleinberg and Kilian Q Weinberger. 2017. On fairness and calibration. In Advances in Neural Information Processing Systems. 5680--5689."},{"key":"e_1_3_2_1_39_1","volume-title":"A study of the implications of advanced digital technologies (including AI systems) for the concept of responsibility within a human rights framework","author":"Yeung Karen","year":"2018","unstructured":"Karen Yeung . 2018. 
A study of the implications of advanced digital technologies (including AI systems) for the concept of responsibility within a human rights framework . Council of Europe) I-AUT 2018 ) 05 29 (2018). Karen Yeung. 2018. A study of the implications of advanced digital technologies (including AI systems) for the concept of responsibility within a human rights framework. Council of Europe) I-AUT2018) 05 29 (2018)."}],"event":{"name":"AIES '21: AAAI\/ACM Conference on AI, Ethics, and Society","location":"Virtual Event USA","acronym":"AIES '21","sponsor":["SIGAI ACM Special Interest Group on Artificial Intelligence","AAAI"]},"container-title":["Proceedings of the 2021 AAAI\/ACM Conference on AI, Ethics, and Society"],"original-title":[],"link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3461702.3462528","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3461702.3462528","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,17]],"date-time":"2025-06-17T20:49:06Z","timestamp":1750193346000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3461702.3462528"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,7,21]]},"references-count":39,"alternative-id":["10.1145\/3461702.3462528","10.1145\/3461702"],"URL":"https:\/\/doi.org\/10.1145\/3461702.3462528","relation":{},"subject":[],"published":{"date-parts":[[2021,7,21]]},"assertion":[{"value":"2021-07-30","order":2,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}