{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,13]],"date-time":"2026-04-13T19:05:50Z","timestamp":1776107150890,"version":"3.50.1"},"reference-count":76,"publisher":"Privacy Enhancing Technologies Symposium Advisory Board","issue":"2","license":[{"start":{"date-parts":[[2022,3,3]],"date-time":"2022-03-03T00:00:00Z","timestamp":1646265600000},"content-version":"unspecified","delay-in-days":0,"URL":"http:\/\/creativecommons.org\/licenses\/by-nc-nd\/3.0"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2022,4,1]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:p>Organizations often collect private data and release aggregate statistics for the public\u2019s benefit. If no steps toward preserving privacy are taken, adversaries may use released statistics to deduce unauthorized information about the individuals described in the private dataset. Differentially private algorithms address this challenge by slightly perturbing underlying statistics with noise, thereby mathematically limiting the amount of information that may be deduced from each data release. Properly calibrating these algorithms\u2014and in turn the disclosure risk for people described in the dataset\u2014requires a data curator to choose a value for a privacy budget parameter,<jats:italic>\u025b<\/jats:italic>. However, there is little formal guidance for choosing<jats:italic>\u025b<\/jats:italic>, a task that requires reasoning about the probabilistic privacy\u2013utility tradeoff. 
Furthermore, choosing <jats:italic>\u025b<\/jats:italic> in the context of statistical inference requires reasoning about accuracy trade-offs in the presence of both measurement error and differential privacy (DP) noise.<\/jats:p><jats:p>We present <jats:bold>Vi<\/jats:bold>sualizing <jats:bold>P<\/jats:bold>rivacy (ViP), an interactive interface that visualizes relationships between <jats:italic>\u025b<\/jats:italic>, accuracy, and disclosure risk to support setting and splitting <jats:italic>\u025b<\/jats:italic> among queries. As a user adjusts <jats:italic>\u025b<\/jats:italic>, ViP dynamically updates visualizations depicting expected accuracy and risk. ViP also has an inference setting, allowing a user to reason about the impact of DP noise on statistical inferences. Finally, we present results of a study where 16 research practitioners with little to no DP background completed a set of tasks related to setting <jats:italic>\u025b<\/jats:italic> using both ViP and a control. We find that ViP helps participants more correctly answer questions related to judging the probability of where a DP-noised release is likely to fall and comparing between DP-noised and non-private confidence intervals.<\/jats:p>","DOI":"10.2478\/popets-2022-0058","type":"journal-article","created":{"date-parts":[[2022,3,5]],"date-time":"2022-03-05T04:27:04Z","timestamp":1646454424000},"page":"601-618","source":"Crossref","is-referenced-by-count":40,"title":["Visualizing Privacy-Utility Trade-Offs in Differentially Private Data Releases"],"prefix":"10.56553","volume":"2022","author":[{"given":"Priyanka","family":"Nanayakkara","sequence":"first","affiliation":[{"name":"Northwestern University"}]},{"given":"Johes","family":"Bater","sequence":"additional","affiliation":[{"name":"Duke University"}]},{"given":"Xi","family":"He","sequence":"additional","affiliation":[{"name":"University of Waterloo"}]},{"given":"Jessica","family":"Hullman","sequence":"additional","affiliation":[{"name":"Northwestern 
University"}]},{"given":"Jennie","family":"Rogers","sequence":"additional","affiliation":[{"name":"Northwestern University"}]}],"member":"35752","published-online":{"date-parts":[[2022,3,3]]},"reference":[{"key":"2022060207204043852_j_popets-2022-0058_ref_001","doi-asserted-by":"crossref","unstructured":"[1] Abowd, J. M. (2018). The US Census Bureau adopts differential privacy. In Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining (pp. 2867\u20132867).","DOI":"10.1145\/3219819.3226070"},{"key":"2022060207204043852_j_popets-2022-0058_ref_002","unstructured":"[2] Aktay, A., Bavadekar, S., Cossoul, G., Davis, J., Desfontaines, D., Fabrikant, A., . . . others (2020). Google COVID-19 Community Mobility Reports: anonymization process description (version 1.1). arXiv preprint arXiv:2004.04145."},{"key":"2022060207204043852_j_popets-2022-0058_ref_003","doi-asserted-by":"crossref","unstructured":"[3] Almasi, M. M., Siddiqui, T. R., Mohammed, N., & Hemmati, H. (2016). The risk-utility tradeoff for data privacy models. In 2016 8th IFIP International Conference on New Technologies, Mobility and Security (NTMS) (pp. 1\u20135).10.1109\/NTMS.2016.7792481","DOI":"10.1109\/NTMS.2016.7792481"},{"key":"2022060207204043852_j_popets-2022-0058_ref_004","unstructured":"[4] Assistive AI Makes Replying Easier. (2020). Retrieved from https:\/\/www.microsoft.com\/en-us\/research\/group\/msai\/articles\/assistive-ai-makes-replying-easier-2\/"},{"key":"2022060207204043852_j_popets-2022-0058_ref_005","unstructured":"[5] Bavadekar, S., Boulanger, A., Davis, J., Desfontaines, D., Gabrilovich, E., Gadepalli, K., . . . others (2021). Google COVID-19 Vaccination Search Insights: Anonymization Process Description. arXiv preprint arXiv:2107.01179."},{"key":"2022060207204043852_j_popets-2022-0058_ref_006","unstructured":"[6] Bavadekar, S., Dai, A., Davis, J., Desfontaines, D., Eckstein, I., Everett, K., . . . others (2020). 
Google COVID-19 Search Trends Symptoms Dataset: Anonymization Process Description (version 1.0). arXiv preprint arXiv:2009.01265."},{"key":"2022060207204043852_j_popets-2022-0058_ref_007","unstructured":"[7] Biswas, S., Dong, Y., Kamath, G., & Ullman, J. (2020). Coinpress: Practical private mean and covariance estimation. arXiv preprint arXiv:2006.06618."},{"key":"2022060207204043852_j_popets-2022-0058_ref_008","doi-asserted-by":"crossref","unstructured":"[8] Bittner, D. M., Brito, A. E., Ghassemi, M., Rane, S., Sarwate, A. D., & Wright, R. N. (2020). Understanding Privacy-Utility Tradeoffs in Differentially Private Online Active Learning. Journal of Privacy and Confidentiality, 10(2).10.29012\/jpc.720","DOI":"10.29012\/jpc.720"},{"key":"2022060207204043852_j_popets-2022-0058_ref_009","unstructured":"[9] Bostock, M. (2012). D3.js - Data-Driven Documents. Retrieved from http:\/\/d3js.org\/"},{"key":"2022060207204043852_j_popets-2022-0058_ref_010","unstructured":"[10] Brawner, T., & Honaker, J. (2018). Bootstrap inference and differential privacy: Standard errors for free. Unpublished Manuscript."},{"key":"2022060207204043852_j_popets-2022-0058_ref_011","unstructured":"[11] Chance, B., Garfield, J., & delMas, R. (2000). Developing Simulation Activities To Improve Students\u2019 Statistical Reasoning."},{"key":"2022060207204043852_j_popets-2022-0058_ref_012","unstructured":"[12] chroma.js. (n.d.). Retrieved from https:\/\/gka.github.io\/chroma.js\/"},{"key":"2022060207204043852_j_popets-2022-0058_ref_013","unstructured":"[13] Cumming, G., & Thomason, N. (1998). Statplay: Multimedia for statistical understanding, in Pereira-Mendoza (ed. In Proceedings of the Fifth International Conference on Teaching Statistics, ISI."},{"key":"2022060207204043852_j_popets-2022-0058_ref_014","doi-asserted-by":"crossref","unstructured":"[14] Cummings, R., Kaptchuk, G., & Redmiles, E. M. (2021). 
\u201cI need a better description\u201d: An Investigation Into User Expectations For Differential Privacy. ACM CCS.10.1145\/3460120.3485252","DOI":"10.1145\/3460120.3485252"},{"key":"2022060207204043852_j_popets-2022-0058_ref_015","doi-asserted-by":"crossref","unstructured":"[15] delMas, R. C., Garfield, J., & Chance, B. (1999). A model of classroom research in action: Developing simulation activities to improve students\u2019 statistical reasoning. Journal of Statistics Education, 7(3).","DOI":"10.1080\/10691898.1999.12131279"},{"key":"2022060207204043852_j_popets-2022-0058_ref_016","unstructured":"[16] Desfontaines, D. (2020). Lowering the cost of anonymization (Unpublished doctoral dissertation). ETH Zurich."},{"key":"2022060207204043852_j_popets-2022-0058_ref_017","unstructured":"[17] Du, W., Foot, C., Moniot, M., Bray, A., & Groce, A. (2020). Differentially private confidence intervals. arXiv preprint arXiv:2001.02285."},{"key":"2022060207204043852_j_popets-2022-0058_ref_018","doi-asserted-by":"crossref","unstructured":"[18] Dwork, C., Kohli, N., & Mulligan, D. (2019). Differential Privacy in Practice: Expose Your Epsilons! Journal of Privacy and Confidentiality, 9(2).10.29012\/jpc.689","DOI":"10.29012\/jpc.689"},{"key":"2022060207204043852_j_popets-2022-0058_ref_019","doi-asserted-by":"crossref","unstructured":"[19] Dwork, C., McSherry, F., Nissim, K., & Smith, A. (2006). Calibrating noise to sensitivity in private data analysis. In Theory of Cryptography Conference (pp. 265\u2013284).10.1007\/11681878_14","DOI":"10.1007\/11681878_14"},{"key":"2022060207204043852_j_popets-2022-0058_ref_020","doi-asserted-by":"crossref","unstructured":"[20] Dwork, C., & Roth, A. (2014). The Algorithmic Foundations of Differential Privacy. Found. Trends Theor. Comput. Sci..","DOI":"10.1561\/9781601988195"},{"key":"2022060207204043852_j_popets-2022-0058_ref_021","unstructured":"[21] Enabling developers and organizations to use differential privacy. (2019). 
Retrieved from https:\/\/developers.googleblog.com\/2019\/09\/enabling-developers-and-organizations.html"},{"key":"2022060207204043852_j_popets-2022-0058_ref_022","unstructured":"[22] Evans, G., King, G., Schwenzfeier, M., & Thakurta, A. (2020). Statistically valid inferences from privacy protected data. URL: GaryKing.org\/dp."},{"key":"2022060207204043852_j_popets-2022-0058_ref_023","doi-asserted-by":"crossref","unstructured":"[23] Fernandes, M., Walls, L., Munson, S., Hullman, J., & Kay, M. (2018). Uncertainty displays using quantile dotplots or cdfs improve transit decision-making. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (pp. 1\u201312).10.1145\/3173574.3173718","DOI":"10.1145\/3173574.3173718"},{"key":"2022060207204043852_j_popets-2022-0058_ref_024","unstructured":"[24] Ferrando, C., Wang, S., & Sheldon, D. (2020). General-Purpose Differentially-Private Confidence Intervals. arXiv preprint arXiv:2006.07749."},{"key":"2022060207204043852_j_popets-2022-0058_ref_025","unstructured":"[25] Gaboardi, M., Hay, M., & Vadhan, S. (2020). A programming framework for opendp. Manuscript, May."},{"key":"2022060207204043852_j_popets-2022-0058_ref_026","unstructured":"[26] Gaboardi, M., Honaker, J., King, G., Murtagh, J., Nissim, K., Ullman, J., & Vadhan, S. (2018). PSI (\u03a8): a Private data Sharing Interface."},{"key":"2022060207204043852_j_popets-2022-0058_ref_027","doi-asserted-by":"crossref","unstructured":"[27] Ganta, S. R., Kasiviswanathan, S. P., & Smith, A. (2008). Composition attacks and auxiliary information in data privacy. In Proceedings of the 14th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (pp. 265\u2013273).10.1145\/1401890.1401926","DOI":"10.1145\/1401890.1401926"},{"key":"2022060207204043852_j_popets-2022-0058_ref_028","doi-asserted-by":"crossref","unstructured":"[28] Ge, C., He, X., Ilyas, I. F., & Machanavajjhala, A. (2019). 
Apex: Accuracy-aware differentially private data exploration. In Proceedings of the 2019 International Conference on Management of Data (pp. 177\u2013194).10.1145\/3299869.3300092","DOI":"10.1145\/3299869.3300092"},{"key":"2022060207204043852_j_popets-2022-0058_ref_029","doi-asserted-by":"crossref","unstructured":"[29] Gigerenzer, G., & Hoffrage, U. (1995). How to improve bayesian reasoning without instruction: frequency formats. Psychological Review, 102(4), 684.10.1037\/0033-295X.102.4.684","DOI":"10.1037\/0033-295X.102.4.684"},{"key":"2022060207204043852_j_popets-2022-0058_ref_030","doi-asserted-by":"crossref","unstructured":"[30] Greig, D. M., Porteous, B. T., & Seheult, A. H. (1989). Exact maximum a posteriori estimation for binary images. Journal of the Royal Statistical Society: Series B (Methodological), 51(2), 271\u2013279.","DOI":"10.1111\/j.2517-6161.1989.tb01764.x"},{"key":"2022060207204043852_j_popets-2022-0058_ref_031","unstructured":"[31] Haeberlen, A., Pierce, B. C., & Narayan, A. (2011). Differential Privacy Under Fire. In USENIX Security Symposium (Vol. 33)."},{"key":"2022060207204043852_j_popets-2022-0058_ref_032","unstructured":"[32] Hawes, M. (2020). Differential Privacy and the 2020 Decennial Census. Webinar."},{"key":"2022060207204043852_j_popets-2022-0058_ref_033","doi-asserted-by":"crossref","unstructured":"[33] Hay, M., Machanavajjhala, A., Miklau, G., Chen, Y., Zhang, D., & Bissias, G. (2016). Exploring privacy-accuracy trade-offs using dpcomp. In Proceedings of the 2016 International Conference on Management of Data (pp. 2101\u20132104).10.1145\/2882903.2899387","DOI":"10.1145\/2882903.2899387"},{"key":"2022060207204043852_j_popets-2022-0058_ref_034","unstructured":"[34] Herda\u011fdelen, A., Dow, A., State, B., Mohassel, P., & Pompe, A. (2020). Protecting privacy in Facebook mobility data during the COVID-19 response. 
Retrieved from https:\/\/research.fb.com\/blog\/2020\/06\/protecting-privacy-in-facebook-mobility-data-during-the-covid-19-response\/"},{"key":"2022060207204043852_j_popets-2022-0058_ref_035","doi-asserted-by":"crossref","unstructured":"[35] Hofman, J. M., Goldstein, D. G., & Hullman, J. (2020). How visualizing inferential uncertainty can mislead readers about treatment effects in scientific results. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1\u201312).10.1145\/3313831.3376454","DOI":"10.1145\/3313831.3376454"},{"key":"2022060207204043852_j_popets-2022-0058_ref_036","unstructured":"[36] Holohan, N., Braghin, S., Mac Aonghusa, P., & Levacher, K. (2019). Diffprivlib: the IBM differential privacy library. arXiv preprint arXiv:1907.02444."},{"key":"2022060207204043852_j_popets-2022-0058_ref_037","doi-asserted-by":"crossref","unstructured":"[37] Hsu, J., Gaboardi, M., Haeberlen, A., Khanna, S., Narayan, A., Pierce, B. C., & Roth, A. (2014). Differential privacy: An economic method for choosing epsilon. In 2014 IEEE 27th Computer Security Foundations Symposium (pp. 398\u2013410).10.1109\/CSF.2014.35","DOI":"10.1109\/CSF.2014.35"},{"key":"2022060207204043852_j_popets-2022-0058_ref_038","doi-asserted-by":"crossref","unstructured":"[38] Hullman, J., Qiao, X., Correll, M., Kale, A., & Kay, M. (2018). In pursuit of error: A survey of uncertainty visualization evaluation. IEEE Transactions on Visualization and Computer Graphics, 25(1), 903\u2013913.10.1109\/TVCG.2018.286488930207956","DOI":"10.1109\/TVCG.2018.2864889"},{"key":"2022060207204043852_j_popets-2022-0058_ref_039","doi-asserted-by":"crossref","unstructured":"[39] Hullman, J., Resnick, P., & Adar, E. (2015). Hypothetical outcome plots outperform error bars and violin plots for inferences about reliability of variable ordering. 
PloS One, 10(11), e0142444.10.1371\/journal.pone.0142444464669826571487","DOI":"10.1371\/journal.pone.0142444"},{"key":"2022060207204043852_j_popets-2022-0058_ref_040","doi-asserted-by":"crossref","unstructured":"[40] Jarvenpaa, S. L. (1990). Graphic displays in decision making\u2014the visual salience effect. Journal of Behavioral Decision Making, 3(4), 247\u2013262.10.1002\/bdm.3960030403","DOI":"10.1002\/bdm.3960030403"},{"key":"2022060207204043852_j_popets-2022-0058_ref_041","doi-asserted-by":"crossref","unstructured":"[41] Kale, A., Kay, M., & Hullman, J. (2020). Visual reasoning strategies for effect size judgments and decisions. IEEE Transactions on Visualization and Computer Graphics.","DOI":"10.1109\/TVCG.2020.3030335"},{"key":"2022060207204043852_j_popets-2022-0058_ref_042","doi-asserted-by":"crossref","unstructured":"[42] Kale, A., Nguyen, F., Kay, M., & Hullman, J. (2018). Hypothetical outcome plots help untrained observers judge trends in ambiguous data. IEEE Transactions on Visualization and Computer Graphics, 25(1), 892\u2013902.10.1109\/TVCG.2018.286490930136961","DOI":"10.1109\/TVCG.2018.2864909"},{"key":"2022060207204043852_j_popets-2022-0058_ref_043","unstructured":"[43] Karwa, V., & Vadhan, S. (2017). Finite sample differentially private confidence intervals. arXiv preprint arXiv:1711.03908."},{"key":"2022060207204043852_j_popets-2022-0058_ref_044","doi-asserted-by":"crossref","unstructured":"[44] Kasiviswanathan, S. P., & Smith, A. (2014). On the \u2019semantics\u2019 of differential privacy: A bayesian formulation. Journal of Privacy and Confidentiality, 6(1).10.29012\/jpc.v6i1.634","DOI":"10.29012\/jpc.v6i1.634"},{"key":"2022060207204043852_j_popets-2022-0058_ref_045","doi-asserted-by":"crossref","unstructured":"[45] Kay, M., Kola, T., Hullman, J. R., & Munson, S. A. (2016). When (ish) is my bus? user-centered visualizations of uncertainty in everyday, mobile predictive systems. 
In Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (pp. 5092\u20135103).","DOI":"10.1145\/2858036.2858558"},{"key":"2022060207204043852_j_popets-2022-0058_ref_046","doi-asserted-by":"crossref","unstructured":"[46] Kho, A. N., Hynes, D. M. D., Goel, S., Solomonides, A. E., Price, R., Hota, B., . . . Others (2014). CAPriCORN: Chicago Area Patient-Centered Outcomes Research Network. Journal of the American Medical Informatics Association, 21(4), 607\u2013611. Retrieved from http:\/\/jamia.oxfordjournals.org\/content\/21\/4\/607.short10.1136\/amiajnl-2014-002827407829824821736","DOI":"10.1136\/amiajnl-2014-002827"},{"key":"2022060207204043852_j_popets-2022-0058_ref_047","doi-asserted-by":"crossref","unstructured":"[47] Kifer, D., & Machanavajjhala, A. (2011). No free lunch in data privacy. In Proceedings of the 2011 ACM SIGMOD International Conference on Management of data (pp. 193\u2013204).10.1145\/1989323.1989345","DOI":"10.1145\/1989323.1989345"},{"key":"2022060207204043852_j_popets-2022-0058_ref_048","doi-asserted-by":"crossref","unstructured":"[48] Kifer, D., & Machanavajjhala, A. (2012). A rigorous and customizable framework for privacy. In Proceedings of the 31st ACM SIGMOD-SIGACT-SIGAI symposium on Principles of Database Systems (pp. 77\u201388).10.1145\/2213556.2213571","DOI":"10.1145\/2213556.2213571"},{"key":"2022060207204043852_j_popets-2022-0058_ref_049","doi-asserted-by":"crossref","unstructured":"[49] Lee, J., & Clifton, C. (2011). How Much is Enough? Choosing \u025b for Differential Privacy. In International Conference on Information Security (pp. 325\u2013340).10.1007\/978-3-642-24861-0_22","DOI":"10.1007\/978-3-642-24861-0_22"},{"key":"2022060207204043852_j_popets-2022-0058_ref_050","doi-asserted-by":"crossref","unstructured":"[50] Li, C., Miklau, G., Hay, M., McGregor, A., & Rastogi, V. (2015). The matrix mechanism: optimizing linear counting queries under differential privacy. 
The VLDB journal, 24(6), 757\u2013781.10.1007\/s00778-015-0398-x","DOI":"10.1007\/s00778-015-0398-x"},{"key":"2022060207204043852_j_popets-2022-0058_ref_051","doi-asserted-by":"crossref","unstructured":"[51] Liu, C., He, X., Chanyaswad, T., Wang, S., & Mittal, P. (2019). Investigating Statistical Privacy Frameworks from the Perspective of Hypothesis Testing. Proc. Priv. Enhancing Technol., 2019(3), 233\u2013254.10.2478\/popets-2019-0045","DOI":"10.2478\/popets-2019-0045"},{"key":"2022060207204043852_j_popets-2022-0058_ref_052","doi-asserted-by":"crossref","unstructured":"[52] Machanavajjhala, A., Kifer, D., Abowd, J., Gehrke, J., & Vilhuber, L. (2008). Privacy: Theory meets practice on the map. In 2008 IEEE 24th International Conference on Data Engineering (pp. 277\u2013286).10.1109\/ICDE.2008.4497436","DOI":"10.1109\/ICDE.2008.4497436"},{"key":"2022060207204043852_j_popets-2022-0058_ref_053","doi-asserted-by":"crossref","unstructured":"[53] McKenna, R., Miklau, G., Hay, M., & Machanavajjhala, A. (2018). Optimizing error of high-dimensional statistical queries under differential privacy. Proceedings of the VLDB Endowment, 11(10), 1206\u20131219.10.14778\/3231751.3231769","DOI":"10.14778\/3231751.3231769"},{"key":"2022060207204043852_j_popets-2022-0058_ref_054","unstructured":"[54] McSherry, F. D. (2009). Privacy Integrated Queries: An Extensible Platform for Privacy-preserving Data Analysis. In Proceedings of the 2009 ACM SIGMOD International Conference on Management of Data, series = SIGMOD \u201909 (pp. 19\u201330). New York, NY, USA: ACM. Retrieved from http:\/\/doi.acm.org\/10.1145\/1559845.1559850 doi: 10.1145\/1559845.155985010.1145\/1559845.1559850"},{"key":"2022060207204043852_j_popets-2022-0058_ref_055","unstructured":"[55] Messing, S., DeGregorio, C., Hillenbrand, B., King, G., Mahanti, S., Mukerjee, Z., . . . Wilkins, A. (2020). Urls-v3.pdf. In Facebook Privacy-Protected Full URLs Data Set. Harvard Dataverse. 
Retrieved from https:\/\/doi.org\/10.7910\/DVN\/TDOAPG\/DGSAMS doi: 10.7910\/DVN\/TDOAPG\/DGSAMS"},{"key":"2022060207204043852_j_popets-2022-0058_ref_056","doi-asserted-by":"crossref","unstructured":"[56] Mironov, I. (2012). On significance of the least significant bits for differential privacy. In Proceedings of the 2012 ACM Conference on Computer and Communications Security (pp. 650\u2013661).10.1145\/2382196.2382264","DOI":"10.1145\/2382196.2382264"},{"key":"2022060207204043852_j_popets-2022-0058_ref_057","doi-asserted-by":"crossref","unstructured":"[57] Mironov, I. (2017). R\u00e9nyi differential privacy. In 2017 IEEE 30th Computer Security Foundations Symposium (CSF) (pp. 263\u2013275).10.1109\/CSF.2017.11","DOI":"10.1109\/CSF.2017.11"},{"key":"2022060207204043852_j_popets-2022-0058_ref_058","unstructured":"[58] Morgenstern, O., & Von Neumann, J. (1953). Theory of games and economic behavior. Princeton University Press."},{"key":"2022060207204043852_j_popets-2022-0058_ref_059","unstructured":"[59] Neyman, J., & Pearson, E. S. (2020). On the use and interpretation of certain test criteria for purposes of statistical inference. Part I. University of California Press."},{"key":"2022060207204043852_j_popets-2022-0058_ref_060","doi-asserted-by":"crossref","unstructured":"[60] Nissim, K., Raskhodnikova, S., & Smith, A. (2007). Smooth sensitivity and sampling in private data analysis. In Proceedings of the Thirty-Ninth Annual ACM Symposium on Theory of Computing (pp. 75\u201384).10.1145\/1250790.1250803","DOI":"10.1145\/1250790.1250803"},{"key":"2022060207204043852_j_popets-2022-0058_ref_061","unstructured":"[61] Rivasplata, O. (2012). Subgaussian random variables: An expository note. Internet publication, PDF."},{"key":"2022060207204043852_j_popets-2022-0058_ref_062","unstructured":"[62] Rogers, R., Cardoso, A. R., Mancuhan, K., Kaura, A., Gahlawat, N., Jain, N., . . . Ahammad, P. (2020). 
A Members First Approach to Enabling LinkedIn\u2019s Labor Market Insights at Scale. arXiv preprint arXiv:2010.13981."},{"key":"2022060207204043852_j_popets-2022-0058_ref_063","unstructured":"[63] Savage, L. J. (1954). The foundations of statistics. Wiley."},{"key":"2022060207204043852_j_popets-2022-0058_ref_064","doi-asserted-by":"crossref","unstructured":"[64] Schwarz, C. J., & Sutherland, J. (1997). An on-line workshop using a simple capture-recapture experiment to illustrate the concepts of a sampling distribution. Journal of Statistics Education, 5(1).10.1080\/10691898.1997.11910523","DOI":"10.1080\/10691898.1997.11910523"},{"key":"2022060207204043852_j_popets-2022-0058_ref_065","doi-asserted-by":"crossref","unstructured":"[65] Shepp, L. A., & Vardi, Y. (1982). Maximum likelihood reconstruction for emission tomography. IEEE Transactions on Medical Imaging, 1(2), 113\u2013122.10.1109\/TMI.1982.430755818238264","DOI":"10.1109\/TMI.1982.4307558"},{"key":"2022060207204043852_j_popets-2022-0058_ref_066","doi-asserted-by":"crossref","unstructured":"[66] St. John, M. F., Denker, G., Laud, P., Martiny, K., & Pankova, A. (2021). Decision Support for Sharing Data Using Differential Privacy. IEEE Transactions on Visualization and Computer Graphics, 26\u201335.","DOI":"10.1109\/VizSec53666.2021.00008"},{"key":"2022060207204043852_j_popets-2022-0058_ref_067","doi-asserted-by":"crossref","unstructured":"[67] Sweeney, L. (2002). k-anonymity: A model for protecting privacy. International Journal of Uncertainty, Fuzziness and Knowledge-Based Systems, 10(05), 557\u2013570.10.1142\/S0218488502001648","DOI":"10.1142\/S0218488502001648"},{"key":"2022060207204043852_j_popets-2022-0058_ref_068","unstructured":"[68] Tableau Software. (n.d.). Color Palettes with RGB Values."},{"key":"2022060207204043852_j_popets-2022-0058_ref_069","unstructured":"[69] Tang, J., Korolova, A., Bai, X., Wang, X., & Wang, X. (2017). 
Privacy loss in apple\u2019s implementation of differential privacy on macos 10.12. arXiv preprint arXiv:1709.02753."},{"key":"2022060207204043852_j_popets-2022-0058_ref_070","unstructured":"[70] Thaker, P., Budiu, M., Gopalan, P., Wieder, U., & Zaharia, M. (2020). Overlook: Differentially Private Exploratory Visualization for Big Data. arXiv preprint arXiv:2006.12018."},{"key":"2022060207204043852_j_popets-2022-0058_ref_071","doi-asserted-by":"crossref","unstructured":"[71] Wasserman, L., & Zhou, S. (2010). A statistical framework for differential privacy. Journal of the American Statistical Association, 105(489), 375\u2013389.10.1198\/jasa.2009.tm08651","DOI":"10.1198\/jasa.2009.tm08651"},{"key":"2022060207204043852_j_popets-2022-0058_ref_072","doi-asserted-by":"crossref","unstructured":"[72] Wilkinson, L. (1999). Dot plots. The American Statistician, 53(3), 276\u2013281.","DOI":"10.1080\/00031305.1999.10474474"},{"key":"2022060207204043852_j_popets-2022-0058_ref_073","unstructured":"[73] Wong, R. C.-W., Fu, A. W.-C., Wang, K., & Pei, J. (2007). Minimality attack in privacy preserving data publishing. In Proceedings of the 33rd International Conference on Very Large Data Bases (pp. 543\u2013554)."},{"key":"2022060207204043852_j_popets-2022-0058_ref_074","doi-asserted-by":"crossref","unstructured":"[74] Wright, P. C., & Monk, A. F. (1991). The use of think-aloud evaluation methods in design. ACM SIGCHI Bulletin, 23(1), 55\u201357.10.1145\/122672.122685","DOI":"10.1145\/122672.122685"},{"key":"2022060207204043852_j_popets-2022-0058_ref_075","doi-asserted-by":"crossref","unstructured":"[75] Xiong, A., Wang, T., Li, N., & Jha, S. (2020). Towards Effective Differential Privacy Communication for Users\u2019 Data Sharing Decision and Comprehension. In 2020 IEEE Symposium on Security and Privacy (SP) (pp. 
392\u2013410).10.1109\/SP40000.2020.00088","DOI":"10.1109\/SP40000.2020.00088"},{"key":"2022060207204043852_j_popets-2022-0058_ref_076","doi-asserted-by":"crossref","unstructured":"[76] Yang, B., Sato, I., & Nakagawa, H. (2015). Bayesian differential privacy on correlated data. In Proceedings of the 2015 ACM SIGMOD International Conference on Management of Data (pp. 747\u2013762).10.1145\/2723372.2747643","DOI":"10.1145\/2723372.2747643"}],"container-title":["Proceedings on Privacy Enhancing Technologies"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.sciendo.com\/pdf\/10.2478\/popets-2022-0058","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2023,1,28]],"date-time":"2023-01-28T12:14:28Z","timestamp":1674908068000},"score":1,"resource":{"primary":{"URL":"https:\/\/petsymposium.org\/popets\/2022\/popets-2022-0058.php"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,3,3]]},"references-count":76,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2022,3,3]]},"published-print":{"date-parts":[[2022,4,1]]}},"alternative-id":["10.2478\/popets-2022-0058"],"URL":"https:\/\/doi.org\/10.2478\/popets-2022-0058","relation":{},"ISSN":["2299-0984"],"issn-type":[{"value":"2299-0984","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,3,3]]}}}