{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,22]],"date-time":"2026-04-22T19:18:00Z","timestamp":1776885480782,"version":"3.51.2"},"publisher-location":"New York, NY, USA","reference-count":61,"publisher":"ACM","license":[{"start":{"date-parts":[[2020,11,8]],"date-time":"2020-11-08T00:00:00Z","timestamp":1604793600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/www.acm.org\/publications\/policies\/copyright_policy#Background"}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":[],"published-print":{"date-parts":[[2020,11,8]]},"DOI":"10.1145\/3368089.3409697","type":"proceedings-article","created":{"date-parts":[[2020,11,8]],"date-time":"2020-11-08T06:03:52Z","timestamp":1604815432000},"page":"654-665","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":121,"title":["Fairway: a way to build fair ML software"],"prefix":"10.1145","author":[{"given":"Joymallya","family":"Chakraborty","sequence":"first","affiliation":[{"name":"North Carolina State University, USA"}]},{"given":"Suvodeep","family":"Majumder","sequence":"additional","affiliation":[{"name":"North Carolina State University, USA"}]},{"given":"Zhe","family":"Yu","sequence":"additional","affiliation":[{"name":"North Carolina State University, USA"}]},{"given":"Tim","family":"Menzies","sequence":"additional","affiliation":[{"name":"North Carolina State University, USA"}]}],"member":"320","published-online":{"date-parts":[[2020,11,8]]},"reference":[{"key":"e_1_3_2_2_1_1","volume-title":"CNBC","year":"2018","unstructured":"\u201c Health care start-up says a.i. can diagnose patients better than humans can, doctors call that 'dubious ',\u201d CNBC , June 2018 . [Online]. Available : https:\/\/www.cnbc.com\/ 2018 \/06\/28\/babylon-claims-its-ai-can-diagnosepatients-better-than-doctors.html \u201cHealth care start-up says a.i. 
can diagnose patients better than humans can, doctors call that 'dubious',\u201d CNBC, June 2018. [Online]. Available: https:\/\/www.cnbc.com\/ 2018 \/06\/28\/babylon-claims-its-ai-can-diagnosepatients-better-than-doctors.html"},{"key":"e_1_3_2_2_2_1","doi-asserted-by":"publisher","DOI":"10.1109\/MSPEC.2016.7473150"},{"key":"e_1_3_2_2_3_1","volume-title":"On orbitz, mac users steered to pricier hotels","year":"2012","unstructured":"\u201c On orbitz, mac users steered to pricier hotels ,\u201d 2012 . [Online]. Available: https: \/\/www.wsj.com\/articles\/SB10001424052702304458604577488822667325882 \u201c On orbitz, mac users steered to pricier hotels ,\u201d 2012. [Online]. Available: https: \/\/www.wsj.com\/articles\/SB10001424052702304458604577488822667325882"},{"key":"e_1_3_2_2_4_1","volume-title":"The algorithm that beats your bank manager","year":"2011","unstructured":"\u201c The algorithm that beats your bank manager ,\u201d 2011 . [Online]. Available: https:\/\/www.forbes.com\/sites\/parmyolson\/2011\/03\/15\/the-algorithm-thatbeats-your-bank-manager\/#15da2651ae99 \u201cThe algorithm that beats your bank manager ,\u201d 2011. [Online]. Available: https:\/\/www.forbes.com\/sites\/parmyolson\/2011\/03\/15\/the-algorithm-thatbeats-your-bank-manager\/#15da2651ae99"},{"key":"e_1_3_2_2_5_1","volume-title":"Machine bias: There's software used across the country to predict future criminals. and it's biased against blacks","year":"2016","unstructured":"\u201c Machine bias: There's software used across the country to predict future criminals. and it's biased against blacks ,\u201d 2016 . [Online]. Available: https:\/\/www. propublica.org\/article\/machine-bias-risk-assessments-in-criminal-sentencing \u201c Machine bias: There's software used across the country to predict future criminals. and it's biased against blacks ,\u201d 2016. [Online]. Available: https:\/\/www. 
propublica.org\/article\/machine-bias-risk-assessments-in-criminal-sentencing"},{"key":"e_1_3_2_2_6_1","volume-title":"Can you program ethics into a self-driving car ?","year":"2016","unstructured":"\u201c Can you program ethics into a self-driving car ? \u201d 2016 . [Online]. Available: https:\/\/spectrum.ieee.org\/transportation\/self-driving\/ can-you-programethics-into-a-selfdriving-car"},{"key":"e_1_3_2_2_7_1","volume-title":"Themis: Automatically testing software for discrimination","author":"Angell R.","unstructured":"R. Angell , B. Johnson , Y. Brun , and A. Meliou , \u201c Themis: Automatically testing software for discrimination ,\u201d ser. ESEC\/FSE 18."},{"key":"e_1_3_2_2_8_1","doi-asserted-by":"publisher","DOI":"10.1145\/3236024.3264838"},{"key":"e_1_3_2_2_9_1","unstructured":"\u201cAcm conference on fairness accountability and transparency (acm fat* ).\u201d [Online]. Available: https:\/\/fatconference.org\/"},{"key":"e_1_3_2_2_10_1","unstructured":"\u201cExplain 2019.\u201d [Online]. Available: https:\/\/2019.ase-conferences.org\/home\/ explain-2019"},{"key":"e_1_3_2_2_11_1","doi-asserted-by":"publisher","DOI":"10.1145\/3338906.3338937"},{"key":"e_1_3_2_2_12_1","volume-title":"White men account for 72% of corporate leadership at 16 of the fortune 500 companies","year":"2017","unstructured":"\u201c White men account for 72% of corporate leadership at 16 of the fortune 500 companies ,\u201d 2017 . [Online]. Available: https:\/\/fortune.com\/ 2017 \/06\/09\/whitemen-senior-executives-fortune-500-companies-diversity-data\/"},{"key":"e_1_3_2_2_13_1","volume-title":"Why is my classifier discriminatory?","author":"Chen I.","year":"2018","unstructured":"I. Chen , F. D. Johansson , and D. Sontag , \u201c Why is my classifier discriminatory? \u201d 2018 ."},{"key":"e_1_3_2_2_14_1","volume-title":"A convex framework for fair regression","author":"Berk R.","year":"2017","unstructured":"R. Berk , H. Heidari , S. Jabbari , M. Joseph , M. Kearns , J. Morgenstern , S. Neel , and A. Roth , \u201c A convex framework for fair regression ,\u201d 2017 ."},{"key":"e_1_3_2_2_15_1","volume-title":"Fairway","author":"Chakraborty J.","year":"2020","unstructured":"J. Chakraborty , \u201c Fairway ,\u201d 6 2020 . [Online]. Available: https:\/\/figshare.com\/ articles\/software\/Fairway\/12521408 J. Chakraborty, \u201cFairway,\u201d 6 2020. [Online]. 
Available: https:\/\/figshare.com\/ articles\/software\/Fairway\/12521408"},{"key":"e_1_3_2_2_16_1","volume-title":"Motherboard","year":"2017","unstructured":"\u201c Google's sentiment analyzer thinks being gay is bad ,\u201d Motherboard , Oct 2017 . [Online]. Available : https:\/\/bit.ly\/2yMax8V"},{"key":"e_1_3_2_2_17_1","volume-title":"Google apologizes for mis-tagging photos of african americans","year":"2015","unstructured":"\u201c Google apologizes for mis-tagging photos of african americans , \u201d July 2015 . [Online]. Available: https:\/\/cbsn.ws\/2LBYbdy"},{"key":"e_1_3_2_2_18_1","doi-asserted-by":"publisher","DOI":"10.1126\/science.aal4230"},{"key":"e_1_3_2_2_19_1","doi-asserted-by":"publisher","DOI":"10.18653\/v1\/W17-1606"},{"key":"e_1_3_2_2_20_1","volume-title":"Study finds gender and skin-type bias in commercial artificial-intelligence systems","year":"2018","unstructured":"\u201c Study finds gender and skin-type bias in commercial artificial-intelligence systems ,\u201d 2018 . [Online]. Available: http:\/\/news.mit.edu\/2018\/study-finds-genderskin-type-bias-artificial-intelligence-systems-0212"},{"key":"e_1_3_2_2_21_1","volume-title":"Machine bias","year":"2016","unstructured":"\u201c Machine bias ,\u201d www.propublica.org, May 2016 . [Online]. Available: https:\/\/www. propublica.org\/article\/machine-bias-risk-assessments-in-criminal-sentencing"},{"key":"e_1_3_2_2_22_1","volume-title":"Amazon scraps secret ai recruiting tool that showed bias against women","year":"2018","unstructured":"\u201c Amazon scraps secret ai recruiting tool that showed bias against women , \u201d Oct 2018 . [Online]. Available: https:\/\/www.reuters.com\/article\/us-amazoncom-jobs-automation-insight\/amazon-scraps-secret-ai-recruiting-tool-thatshowed-bias-against-women-idUSKCN1MK08G"},{"key":"e_1_3_2_2_23_1","volume-title":"Ethically-aligned design: A vision for prioritizing human well-begin with autonomous and intelligence systems","year":"2019","unstructured":"\u201c Ethically-aligned design: A vision for prioritizing human well-begin with autonomous and intelligence systems .\u201d 2019 ."},{"key":"e_1_3_2_2_24_1","volume-title":"Ethics guidelines for trustworthy artificial intelligence","year":"2018","unstructured":"\u201c Ethics guidelines for trustworthy artificial intelligence .\u201d 2018 . [Online]. Available: https:\/\/ec.europa.eu\/digital-single-market\/en\/news\/ethics-guidelinestrustworthy-ai"},{"key":"e_1_3_2_2_25_1","volume-title":"Microsoft ai principles. 2019","year":"2019","unstructured":"\u201c Microsoft ai principles. 2019 .\u201d 2019 . [Online]. Available: https:\/\/www.microsoft. com\/en-us\/ai\/our-approach-to-ai \u201c Microsoft ai principles. 
2019.\u201d 2019. [Online]. Available: https:\/\/www.microsoft. com\/en-us\/ai\/our-approach-to-ai"},{"key":"e_1_3_2_2_26_1","volume-title":"Ai fairness 360 : An extensible toolkit for detecting, understanding, and mitigating unwanted algorithmic bias","year":"2018","unstructured":"\u201c Ai fairness 360 : An extensible toolkit for detecting, understanding, and mitigating unwanted algorithmic bias ,\u201d 10 2018 . [Online]. Available: https: \/\/github.com\/IBM\/AIF360"},{"key":"e_1_3_2_2_27_1","volume-title":"Fate: Fairness, accountability, transparency, and ethics in ai","year":"2018","unstructured":"\u201c Fate: Fairness, accountability, transparency, and ethics in ai ,\u201d 2018 . [Online]. Available: https:\/\/www.microsoft.com\/en-us\/research\/group\/fate\/"},{"key":"e_1_3_2_2_28_1","volume-title":"Facebook says it has a tool to detect bias in its artificial intelligence","year":"2018","unstructured":"\u201c Facebook says it has a tool to detect bias in its artificial intelligence ,\u201d 2018 . [Online]. Available: https:\/\/qz.com\/1268520\/facebook-says-it-has-a-toolto-detect-bias-in-its-artificial-intelligence\/"},{"key":"e_1_3_2_2_29_1","unstructured":"F. Tramer V. Atlidakis R. Geambasu D. Hsu J.-P. Hubaux M. Humbert A. Juels and H. Lin \u201cFairtest: Discovering unwarranted associations in data-driven applications \u201d EuroS&P17 Apr."},{"key":"e_1_3_2_2_30_1"},{"key":"e_1_3_2_2_31_1"},{"key":"e_1_3_2_2_32_1","first-page":"2718","volume-title":"Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence, ser. IJCAI'16","author":"Zhang L.","year":"2016","unstructured":"L. Zhang , Y. Wu , and X. Wu , \u201c Situation testing-based discrimination discovery: A causal inference approach ,\u201d in Proceedings of the Twenty-Fifth International Joint Conference on Artificial Intelligence, ser. IJCAI'16 . AAAI Press , 2016 , p. 2718 - 2724 ."},{"key":"e_1_3_2_2_33_1","doi-asserted-by":"publisher","DOI":"10.1007\/s10115-011-0463-8"},{"key":"e_1_3_2_2_34_1","first-page":"3992","volume-title":"Advances in Neural Information Processing Systems 30","author":"Calmon F.","year":"2017","unstructured":"F. Calmon , D. Wei , B. Vinzamuri , K. Natesan Ramamurthy , and K. R. Varshney , \u201c Optimized pre-processing for discrimination prevention ,\u201d in Advances in Neural Information Processing Systems 30 , I. Guyon, U. V. Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan, and R. Garnett, Eds. Curran Associates, Inc. , 2017 , pp. 3992 - 4001 . [Online]. Available: http:\/\/papers.nips.cc\/paper\/6988-optimizedpre-processing-for-discrimination-prevention.pdf F. Calmon, D. Wei, B. Vinzamuri, K. Natesan Ramamurthy, and K. R. Varshney, \u201c Optimized pre-processing for discrimination prevention ,\u201d in Advances in Neural Information Processing Systems 30, I. Guyon, U. V. Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan, and R. Garnett, Eds. 
Curran Associates, Inc., 2017, pp. 3992-4001. [Online]. Available: http:\/\/papers.nips.cc\/paper\/6988-optimizedpre-processing-for-discrimination-prevention.pdf"},{"key":"e_1_3_2_2_35_1","doi-asserted-by":"publisher","DOI":"10.1145\/3278721.3278779"},{"key":"e_1_3_2_2_36_1","doi-asserted-by":"publisher","DOI":"10.1007\/978-3-642-33486-3_3"},{"key":"e_1_3_2_2_37_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.ins.2017.09.064"},{"key":"e_1_3_2_2_38_1","volume-title":"On fairness and calibration","author":"Pleiss G.","year":"2017","unstructured":"G. Pleiss , M. Raghavan , F. Wu , J. Kleinberg , and K. Q. Weinberger , \u201c On fairness and calibration ,\u201d 2017 ."},{"key":"e_1_3_2_2_39_1","volume-title":"Equality of opportunity in supervised learning","author":"Hardt M.","year":"2016","unstructured":"M. Hardt , E. Price , and N. Srebro , \u201c Equality of opportunity in supervised learning ,\u201d 2016 ."},{"key":"e_1_3_2_2_40_1","volume-title":"Bias or systematic error (validity )","author":"Martin J.","year":"2010","unstructured":"J. Martin , \u201c Bias or systematic error (validity ) ,\u201d 2010 . [Online]. Available: https:\/\/www.ctspedia.org\/do\/view\/CTSpedia\/BiasDefinition"},{"key":"e_1_3_2_2_41_1","volume-title":"Ai fairness 360 : An extensible toolkit for detecting, understanding, and mitigating unwanted algorithmic bias","author":"Bellamy R.","year":"2018","unstructured":"R. Bellamy , K. Dey , M. Hind , S. C. Hofman , S. Houde , K. Kannan , P. Lohia , J. Martino , S. Mehta , A. Mojsilovic , S. Nagar , K. Natesan Ramamurthy , J. Richards , D. Saha , P. Sattigeri , M. Singh , R. Kush , and Y. Zhang , \u201c Ai fairness 360 : An extensible toolkit for detecting, understanding, and mitigating unwanted algorithmic bias ,\u201d 10 2018 ."},{"key":"e_1_3_2_2_42_1","doi-asserted-by":"publisher","DOI":"10.1145\/3097983.3098095"},{"key":"e_1_3_2_2_43_1","volume-title":"Certifying and removing disparate impact","author":"Feldman M.","year":"2014","unstructured":"M. Feldman , S. Friedler , J. Moeller , C. Scheidegger , and S. Venkatasubramanian , \u201c Certifying and removing disparate impact ,\u201d 2014 ."},{"key":"e_1_3_2_2_44_1","volume-title":"Fair prediction with disparate impact: A study of bias in recidivism prediction instruments","author":"Chouldechova A.","year":"2016","unstructured":"A. Chouldechova , \u201c Fair prediction with disparate impact: A study of bias in recidivism prediction instruments ,\u201d 2016 ."},{"key":"e_1_3_2_2_45_1","volume-title":"Inherent trade-ofs in the fair determination of risk scores","author":"Kleinberg J.","year":"2016","unstructured":"J. Kleinberg , S. Mullainathan , and M. Raghavan , \u201c Inherent trade-ofs in the fair determination of risk scores ,\u201d 2016 . J. Kleinberg, S. Mullainathan, and M. 
Raghavan, \u201c Inherent trade-ofs in the fair determination of risk scores ,\u201d 2016."},{"key":"e_1_3_2_2_46_1","doi-asserted-by":"publisher","DOI":"10.1145\/3306618.3314234"},{"key":"e_1_3_2_2_47_1","volume-title":"Software engineering for fairness: A case study with hyperparameter optimization","author":"Chakraborty J.","year":"2019","unstructured":"J. Chakraborty , T. Xia , F. M. Fahid , and T. Menzies , \u201c Software engineering for fairness: A case study with hyperparameter optimization ,\u201d 2019 ."},{"key":"e_1_3_2_2_48_1","volume-title":"Uci:adult data set","year":"1994","unstructured":"\u201c Uci:adult data set ,\u201d 1994 . [Online]. Available: http:\/\/mlr.cs.umass.edu\/ml\/ datasets\/Adult"},{"key":"e_1_3_2_2_49_1","volume-title":"propublica\/compas-analysis","year":"2015","unstructured":"\u201c propublica\/compas-analysis ,\u201d 2015 . [Online]. Available: https:\/\/github.com\/ propublica\/compas-analysis"},{"key":"e_1_3_2_2_50_1","volume-title":"Uci:statlog (german credit data) data set","year":"2000","unstructured":"\u201c Uci:statlog (german credit data) data set ,\u201d 2000 . [Online]. Available: https: \/\/archive.ics.uci.edu\/ml\/datasets\/Statlog+(German+Credit+Data)"},{"key":"e_1_3_2_2_51_1","volume-title":"Uci:default of credit card clients data set","year":"2016","unstructured":"\u201c Uci:default of credit card clients data set ,\u201d 2016 . [Online]. Available: https:\/\/archive.ics.uci.edu\/ml\/datasets\/default+of+credit+card+clients"},{"key":"e_1_3_2_2_52_1","volume-title":"Uci:heart disease data set","year":"2001","unstructured":"\u201c Uci:heart disease data set ,\u201d 2001 . [Online]. Available: https:\/\/archive.ics.uci.edu\/ ml\/datasets\/Heart+Disease"},{"key":"e_1_3_2_2_53_1","unstructured":"\u201cFairware 2018 :international workshop on software fairness.\u201d [Online]. Available: http:\/\/fairware.cs.umass.edu\/"},{"key":"e_1_3_2_2_54_1","volume-title":"Amazon just showed us that 'unbiased' algorithms can be inadvertently racist","unstructured":"\u201c Amazon just showed us that 'unbiased' algorithms can be inadvertently racist .\u201d [Online]. Available: https:\/\/www.businessinsider. com\/how-algorithms-can-beracist-2016-4"},{"key":"e_1_3_2_2_55_1","volume-title":"Why split data in the ratio 70 :30?","year":"2012","unstructured":"\u201c Why split data in the ratio 70 :30? \u201d 2012 . [Online]. Available: http:\/\/informationgain.blogspot.com\/"},{"key":"e_1_3_2_2_56_1","first-page":"1","volume-title":"TSE","author":"Nair V.","year":"2018","unstructured":"V. Nair , Z. Yu , T. Menzies , N. Siegmund , and S. Apel , \u201c Finding faster configurations using flash , \u201d TSE , pp. 
1 - 1 , 2018 ."},{"key":"e_1_3_2_2_57_1","doi-asserted-by":"publisher","DOI":"10.1145\/3106237.3106238"},{"key":"e_1_3_2_2_58_1","doi-asserted-by":"publisher","DOI":"10.1023\/A:1008202821328"},{"key":"e_1_3_2_2_59_1","doi-asserted-by":"publisher","DOI":"10.1109\/4235.996017"},{"key":"e_1_3_2_2_60_1"},{"key":"e_1_3_2_2_61_1","first-page":"4765","volume-title":"Advances in Neural Information Processing Systems 30","author":"Lundberg S. M.","year":"2017","unstructured":"S. M. Lundberg and S.-I. Lee , \u201c A unified approach to interpreting model predictions ,\u201d in Advances in Neural Information Processing Systems 30 , I. Guyon, U. V. Luxburg, S. Bengio, H. Wallach, R. Fergus, S. Vishwanathan, and R. Garnett, Eds. Curran Associates, Inc. , 2017 , pp. 4765 - 4774 . [Online]. Available: http:\/\/papers.nips.cc\/paper\/7062-a-unified-approach-to-interpretingmodel-predictions.pdf"}],"event":{"name":"ESEC\/FSE '20: 28th ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering","location":"Virtual Event USA","acronym":"ESEC\/FSE '20","sponsor":["SIGSOFT ACM Special Interest Group on Software Engineering"]},"container-title":["Proceedings of the 28th ACM Joint Meeting on European Software Engineering Conference and Symposium on the Foundations of Software Engineering"],"original-title":[],"link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3368089.3409697","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3368089.3409697","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,17]],"date-time":"2025-06-17T23:44:40Z","timestamp":1750203880000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3368089.3409697"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2020,11,8]]},"references-count":61,"alternative-id":["10.1145\/3368089.3409697","10.1145\/3368089"],"URL":"https:\/\/doi.org\/10.1145\/3368089.3409697","relation":{},"subject":[],"published":{"date-parts":[[2020,11,8]]},"assertion":[{"value":"2020-11-08","order":2,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}