{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,12]],"date-time":"2025-10-12T02:03:49Z","timestamp":1760234629602,"version":"build-2065373602"},"reference-count":30,"publisher":"MDPI AG","issue":"6","license":[{"start":{"date-parts":[[2021,6,8]],"date-time":"2021-06-08T00:00:00Z","timestamp":1623110400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>Within exponential families, which may consist of multi-parameter and multivariate distributions, a variety of divergence measures, such as the Kullback\u2013Leibler divergence, the Cressie\u2013Read divergence, the R\u00e9nyi divergence, and the Hellinger metric, can be explicitly expressed in terms of the respective cumulant function and mean value function. Moreover, the same applies to related entropy and affinity measures. We compile representations scattered in the literature and present a unified approach to the derivation in exponential families. As a statistical application, we highlight their use in the construction of confidence regions in a multi-sample setup.<\/jats:p>","DOI":"10.3390\/e23060726","type":"journal-article","created":{"date-parts":[[2021,6,8]],"date-time":"2021-06-08T12:08:55Z","timestamp":1623154135000},"page":"726","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":1,"title":["On Representations of Divergence Measures and Related Quantities in Exponential Families"],"prefix":"10.3390","volume":"23","author":[{"ORCID":"https:\/\/orcid.org\/0000-0003-2691-7610","authenticated-orcid":false,"given":"Stefan","family":"Bedbur","sequence":"first","affiliation":[{"name":"Institute of Statistics, RWTH Aachen University, 52056 Aachen, Germany"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-7739-2426","authenticated-orcid":false,"given":"Udo","family":"Kamps","sequence":"additional","affiliation":[{"name":"Institute of Statistics, RWTH Aachen University, 52056 Aachen, Germany"}]}],"member":"1968","published-online":{"date-parts":[[2021,6,8]]},"reference":[{"key":"ref_1","unstructured":"Pardo, L. (2006). Statistical Inference Based on Divergence Measures, Chapman & Hall\/CRC."},{"key":"ref_2","unstructured":"Liese, F., and Vajda, I. (1987). Convex Statistical Distances, Teubner."},{"key":"ref_3","unstructured":"Vajda, I. (1989). Theory of Statistical Inference and Information, Kluwer Academic Publishers."},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Liese, F., and Miescke, K.J. (2008). Statistical Decision Theory: Estimation, Testing, and Selection, Springer.","DOI":"10.1007\/978-0-387-73194-0_3"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"16","DOI":"10.1016\/j.jmva.2008.03.011","article-title":"Parametric estimation and tests through divergences and the duality technique","volume":"100","author":"Broniatowski","year":"2009","journal-title":"J. Multivar. Anal."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"77","DOI":"10.1016\/j.stamet.2016.04.002","article-title":"Homogeneity testing via weighted affinity in multiparameter exponential families","volume":"32","author":"Katzur","year":"2016","journal-title":"Stat. Methodol."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"37","DOI":"10.1016\/S0893-9659(99)00142-1","article-title":"Shannon\u2019s entropy in exponential families: Statistical applications","volume":"13","author":"Menendez","year":"2000","journal-title":"Appl. Math. Lett."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"1099","DOI":"10.1080\/03610929708831970","article-title":"Divergence measures between populations: Applications in the exponential family","volume":"26","author":"Morales","year":"1997","journal-title":"Commun. Statist. Theory Methods"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"133","DOI":"10.1080\/02331880310001634647","article-title":"R\u00e9nyi statistics for testing composite hypotheses in general exponential models","volume":"38","author":"Morales","year":"2004","journal-title":"Statistics"},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"20","DOI":"10.1016\/j.jmva.2010.07.010","article-title":"Dual divergence estimators and tests: Robustness results","volume":"102","author":"Toma","year":"2011","journal-title":"J. Multivar. Anal."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"75","DOI":"10.1016\/j.jmva.2016.05.007","article-title":"Classification into Kullback\u2013Leibler balls in exponential families","volume":"150","author":"Katzur","year":"2016","journal-title":"J. Multivar. Anal."},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Barndorff-Nielsen, O. (2014). Information and Exponential Families in Statistical Theory, Wiley.","DOI":"10.1002\/9781118445112.stat00970"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Brown, L.D. (1986). Fundamentals of Statistical Exponential Families, Institute of Mathematical Statistics.","DOI":"10.1214\/lnms\/1215466757"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Pfanzagl, J. (1994). Parametric Statistical Theory, de Gruyter.","DOI":"10.1515\/9783110889765"},{"key":"ref_15","unstructured":"Kullback, S. (1959). Information Theory and Statistics, Wiley."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"533","DOI":"10.1093\/biomet\/42.3-4.533","article-title":"Exact forms of some invariants for distributions admitting sufficient statistics","volume":"42","author":"Huzurbazar","year":"1955","journal-title":"Biometrika"},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Nielsen, F., and Nock, R. (2010, January 26\u201329). Entropies and cross-entropies of exponential families. Proceedings of the 2010 IEEE 17th International Conference on Image Processing, Hong Kong, China.","DOI":"10.1109\/ICIP.2010.5652054"},{"key":"ref_18","unstructured":"Johnson, D., and Sinanovic, S. (2001). Symmetrizing the Kullback\u2013Leibler distance. IEEE Trans. Inf. Theory, Available online: https:\/\/hdl.handle.net\/1911\/19969."},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Nielsen, F. (2019). On the Jensen\u2013Shannon symmetrization of distances relying on abstract means. Entropy, 21.","DOI":"10.3390\/e21050485"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"52","DOI":"10.1109\/TCOM.1967.1089532","article-title":"The divergence and Bhattacharyya distance measures in signal selection","volume":"15","author":"Kailath","year":"1967","journal-title":"IEEE Trans. Commun. Technol."},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"24","DOI":"10.1016\/j.jmva.2013.03.010","article-title":"Distances between models of generalized order statistics","volume":"118","author":"Vuong","year":"2013","journal-title":"J. Multivar. Anal."},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Nielsen, F. (2020). On a generalization of the Jensen\u2013Shannon divergence and the Jensen\u2013Shannon centroid. Entropy, 22.","DOI":"10.3390\/e22020221"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"303","DOI":"10.1007\/s00184-015-0556-6","article-title":"On local divergences between two probability measures","volume":"79","author":"Avlogiaris","year":"2016","journal-title":"Metrika"},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"2053","DOI":"10.1016\/j.jmva.2008.02.004","article-title":"Robust parameter estimation with a small bias against heavy contamination","volume":"99","author":"Fujisawa","year":"2008","journal-title":"J. Multivar. Anal."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"631","DOI":"10.1214\/aoms\/1177728422","article-title":"Decision rules based on the distance, for problems of fit, two samples, and estimation","volume":"26","author":"Matusita","year":"1955","journal-title":"Ann. Math. Statist."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"181","DOI":"10.1007\/BF02911675","article-title":"On the notion of affinity of several distributions and some of its applications","volume":"19","author":"Matusita","year":"1967","journal-title":"Ann. Inst. Statist. Math."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"426","DOI":"10.1023\/A:1004100931499","article-title":"Asymptotic distribution of estimated affinity between multiparameter exponential families","volume":"52","author":"Garren","year":"2000","journal-title":"Ann. Inst. Statist. Math."},{"key":"ref_28","first-page":"2013","article-title":"Exponential family and Taneja\u2019s entropy","volume":"41","author":"Beitollahi","year":"2010","journal-title":"Appl. Math. Sci."},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"032003","DOI":"10.1088\/1751-8113\/45\/3\/032003","article-title":"A closed-form expression for the Sharma\u2013Mittal entropy of exponential families","volume":"45","author":"Nielsen","year":"2012","journal-title":"J. Phys. A Math. Theor."},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"71","DOI":"10.1016\/j.spl.2004.10.023","article-title":"Expressions for R\u00e9nyi and Shannon entropies for multivariate distributions","volume":"71","author":"Zografos","year":"2005","journal-title":"Statist. Probab. Lett."}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/23\/6\/726\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T06:11:55Z","timestamp":1760163115000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/23\/6\/726"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,6,8]]},"references-count":30,"journal-issue":{"issue":"6","published-online":{"date-parts":[[2021,6]]}},"alternative-id":["e23060726"],"URL":"https:\/\/doi.org\/10.3390\/e23060726","relation":{},"ISSN":["1099-4300"],"issn-type":[{"type":"electronic","value":"1099-4300"}],"subject":[],"published":{"date-parts":[[2021,6,8]]}}}