{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,5,7]],"date-time":"2026-05-07T00:36:55Z","timestamp":1778114215401,"version":"3.51.4"},"reference-count":60,"publisher":"MDPI AG","issue":"3","license":[{"start":{"date-parts":[[2008,9,19]],"date-time":"2008-09-19T00:00:00Z","timestamp":1221782400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/3.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>Axiomatic characterizations of Shannon entropy, Kullback I-divergence, and some generalized information measures are surveyed. Three directions are treated: (A) Characterization of functions of probability distributions suitable as information measures. (B) Characterization of set functions on the subsets of {1, ..., N} representable by joint entropies of components of an N-dimensional random vector. (C) Axiomatic characterization of MaxEnt and related inference rules. The paper concludes with a brief discussion of the relevance of the axiomatic approach for information theory.<\/jats:p>","DOI":"10.3390\/e10030261","type":"journal-article","created":{"date-parts":[[2008,9,29]],"date-time":"2008-09-29T05:43:40Z","timestamp":1222667020000},"page":"261-273","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":209,"title":["Axiomatic Characterizations of Information Measures"],"prefix":"10.3390","volume":"10","author":[{"given":"Imre","family":"Csisz\u00e1r","sequence":"first","affiliation":[{"name":"R\u00e9nyi Institute of Mathematics, Hungarian Academy of Sciences, P.O.Box 127, H1364 Budapest, Hungary"}]}],"member":"1968","published-online":{"date-parts":[[2008,9,19]]},"reference":[{"key":"ref_1","unstructured":"Acz\u00e9l, J., and Dar\u00f3czy, Z. (1975). 
On Measures of Information and Their Characterizations, Academic Press."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"149","DOI":"10.1051\/ita\/1978120201491","article-title":"A mixed theory of information I","volume":"12","year":"1978","journal-title":"RAIRO Inform. Theory"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"131","DOI":"10.2307\/1426210","article-title":"Why Shannon and Hartley entropies are \u201cnatural\u201d","volume":"6","author":"Forte","year":"1974","journal-title":"Adv. Appl. Probab."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"4198","DOI":"10.1109\/TIT.2006.879972","article-title":"An interpretation of identification entropy","volume":"52","author":"Ahlswede","year":"2006","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"131","DOI":"10.1111\/j.2517-6161.1966.tb00626.x","article-title":"A general class of coefficients of divergence of one distribution from another","volume":"28","author":"Ali","year":"1966","journal-title":"J. Roy. Statist. Soc. B"},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"181","DOI":"10.1016\/S0019-9958(71)90065-9","article-title":"Information-theoretic considerations on estimation problems","volume":"19","author":"Arimoto","year":"1971","journal-title":"Information and Control"},{"key":"ref_7","unstructured":"Csisz\u00e1r, I., and Elias, P. (1977). Topics in Information Theory, North Holland. Colloq. Math. Soc. J. Bolyai 16."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"227","DOI":"10.1016\/S0019-9958(78)90587-9","article-title":"f-entropies, probability of error, and feature selection","volume":"39","year":"1978","journal-title":"Information and Control"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"1915","DOI":"10.1109\/18.476316","article-title":"Generalized privacy amplification","volume":"41","author":"Bennett","year":"1995","journal-title":"IEEE Trans. Inf. 
Theory"},{"key":"ref_10","first-page":"99","article-title":"On a measure of divergence between two statistical populations defined by their probability distributions","volume":"35","author":"Bhattacharyya","year":"1943","journal-title":"Bull. Calcutta Math. Soc."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"200","DOI":"10.1016\/0041-5553(67)90040-7","article-title":"The relaxation method of finding the common point of convex sets and its application to the solution of problems in convex programming","volume":"7","author":"Bregman","year":"1967","journal-title":"USSR Comp. Math. and Math. Phys."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"423","DOI":"10.1016\/S0019-9958(65)90332-3","article-title":"A coding theorem and R\u00e9nyi\u2019s entropy","volume":"8","author":"Campbell","year":"1965","journal-title":"Information and Control"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"7","DOI":"10.1017\/S0950184300003244","article-title":"On a functional equation","volume":"43","author":"Chaundy","year":"1960","journal-title":"Edinburgh Math. Notes"},{"key":"ref_14","first-page":"85","article-title":"Eine informationstheoretische Ungleichung und ihre Anwendung auf den Beweis der Ergodizit\u00e4t von Markoffschen Ketten","volume":"8","year":"1963","journal-title":"Publ. Math. Inst. Hungar. Acad. Sci."},{"key":"ref_15","first-page":"299","article-title":"Information-type measures of difference of probability distributions and indirect observations","volume":"2","year":"1967","journal-title":"Studia Sci. Math. Hungar."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"191","DOI":"10.1007\/BF02018661","article-title":"A class of measures of informativity of observation channels","volume":"2","year":"1972","journal-title":"Periodica Math. Hungar."},{"key":"ref_17","unstructured":"Csisz\u00e1r, I. (1977). Trans. 7th Prague Conference on Inf. 
Theory, etc., Academia."},{"key":"ref_18","first-page":"2032","article-title":"Why least squares and maximum entropy? An axiomatic approach to inference for linear inverse problems","volume":"19","year":"1991","journal-title":"Ann. Statist."},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"26","DOI":"10.1109\/18.370121","article-title":"Generalized cutoff rates and R\u00e9nyi information measures","volume":"41","year":"1995","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"381","DOI":"10.1007\/BF00533413","article-title":"\u00dcber die gemeinsame Charakterisierung der zu den nicht vollst\u00e4ndigen Verteilungen geh\u00f6rigen Entropien von Shannon und von R\u00e9nyi","volume":"1","year":"1963","journal-title":"Z. Wahrscheinlichkeitsth. Verw. Gebiete"},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"203","DOI":"10.1007\/BF01897038","article-title":"\u00dcber Mittelwerte und Entropien vollst\u00e4ndiger Wahrscheinlichkeitsverteilungen","volume":"15","year":"1964","journal-title":"Acta Math. Acad. Sci. Hungar."},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"36","DOI":"10.1016\/S0019-9958(70)80040-7","article-title":"Generalized information functions","volume":"16","year":"1970","journal-title":"Information and Control"},{"key":"ref_23","first-page":"11","article-title":"On the measurable solutions of a functional equation","volume":"34","year":"1971","journal-title":"Acta Math. Acad. Sci. Hungar."},{"key":"ref_24","unstructured":"Gyires, B. (1979). Analytic Function Methods in Probability and Statistics, North Holland. Colloq. Math. Soc. J. 
Bolyai 21."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"149","DOI":"10.1016\/S0019-9958(75)90514-8","article-title":"The role of boundedness in characterizing Shannon entropy","volume":"29","author":"Diderrich","year":"1975","journal-title":"Information and Control"},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Ebanks, B., Sahoo, P., and Sander, W. (1998). Characterizations of Information Measures, World Scientific.","DOI":"10.1142\/9789812817167"},{"key":"ref_27","first-page":"227","article-title":"On the concept of entropy of a finite probability scheme (in Russian)","volume":"11","author":"Faddeev","year":"1956","journal-title":"Uspehi Mat. Nauk"},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"199","DOI":"10.1007\/BF02614249","article-title":"On the inequality \u2211pif(pi) \u2265 \u2211pif(qi)","volume":"18","author":"Fischer","year":"1972","journal-title":"Metrika"},{"key":"ref_29","unstructured":"Forte, B. (1975). Conv. Inform. Teor., Rome 1973, Academic Press. Symposia Math. 15."},{"key":"ref_30","first-page":"1367","article-title":"Game theory, maximum entropy, minimum discrepancy and robust Bayesian decision theory","volume":"32","author":"Dawid","year":"2004","journal-title":"Ann. Statist."},{"key":"ref_31","first-page":"30","article-title":"Quantification method of classification processes. Concept of structural a-entropy","volume":"3","author":"Havrda","year":"1967","journal-title":"Kybernetika"},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"131","DOI":"10.4064\/cm-9-1-131-150","article-title":"Information without probability","volume":"9","author":"Ingarden","year":"1962","journal-title":"Colloq. Math."},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"620","DOI":"10.1103\/PhysRev.106.620","article-title":"Information theory and statistical mechanics","volume":"106","author":"Jaynes","year":"1957","journal-title":"Phys. 
Rev."},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"23","DOI":"10.1109\/18.50370","article-title":"General entropy criteria for inverse problems, with applications to data compression, pattern classification and cluster analysis","volume":"36","author":"Jones","year":"1990","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_35","first-page":"110","article-title":"Information et probabilit\u00e9","volume":"265","author":"Forte","year":"1967","journal-title":"C. R. Acad. Sci. Paris A"},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"303","DOI":"10.1090\/S0002-9939-1973-0312110-1","article-title":"Measurable solutions of functional equations related to information theory","volume":"38","author":"Kannappan","year":"1973","journal-title":"Proc. Amer. Math. Soc."},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"105","DOI":"10.4064\/ap-30-1-105-112","article-title":"A functional equation and its applications in information theory","volume":"30","author":"Kannappan","year":"1974","journal-title":"Ann. Polon. Math."},{"key":"ref_38","first-page":"861","article-title":"A new invariant for transitive dynamical systems (in Russian)","volume":"119","author":"Kolmogorov","year":"1958","journal-title":"Dokl. Akad. Nauk SSSR"},{"key":"ref_39","unstructured":"Kullback, S. (1959). Information Theory and Statistics, Wiley."},{"key":"ref_40","doi-asserted-by":"crossref","first-page":"79","DOI":"10.1214\/aoms\/1177729694","article-title":"On information and sufficiency","volume":"22","author":"Kullback","year":"1951","journal-title":"Ann. Math. Statist."},{"key":"ref_41","doi-asserted-by":"crossref","first-page":"415","DOI":"10.1214\/aoms\/1177703765","article-title":"On the axioms of information theory","volume":"35","author":"Lee","year":"1964","journal-title":"Ann. Math. Statist."},{"key":"ref_42","first-page":"1","article-title":"Studies in statistical dynamics","volume":"38","author":"Linhard","year":"1971","journal-title":"Kong.Danske Vid. 
Selskab Mat-fys. Med."},{"key":"ref_43","doi-asserted-by":"crossref","first-page":"445","DOI":"10.1007\/BF01895147","article-title":"On the bounded solutions of a functional equation","volume":"37","author":"Maksa","year":"1981","journal-title":"Acta Math. Acad. Sci. Hungar."},{"key":"ref_44","unstructured":"Mat\u00fa\u0161, F. Infinitely many information inequalities. IEEE ISIT07 Nice, Symposium Proceedings."},{"key":"ref_45","first-page":"273","article-title":"Thermodynamik quantenmechanischer Gesamtheiten","volume":"1","author":"Neumann","year":"1927","journal-title":"G\u00f6tt. Nachr."},{"key":"ref_46","first-page":"183","article-title":"A note on the inevitability of maximum entropy","volume":"4","author":"Paris","year":"1990","journal-title":"Int\u2019l J. Inexact Reasoning"},{"key":"ref_47","unstructured":"Pippenger, N. (1986, January Sep.). What are the laws of information theory?. Special Problems on Communication and Computation Conference, Palo Alto, CA."},{"key":"ref_48","first-page":"547","article-title":"On measures of entropy and information","volume":"Vol. 1","year":"1961","journal-title":"Proc. 4th Berkeley Symp. Math. Statist. Probability, 1960"},{"key":"ref_49","doi-asserted-by":"crossref","first-page":"1","DOI":"10.2307\/1401301","article-title":"On the foundations of information theory","volume":"33","year":"1965","journal-title":"Rev. Inst. Internat. Stat."},{"key":"ref_50","first-page":"11","article-title":"On the probability of large deviations of random variables (in Russian)","volume":"42","author":"Sanov","year":"1957","journal-title":"Mat. Sbornik"},{"key":"ref_51","first-page":"3","article-title":"Contribution aux applications statistiques de la th\u00e9orie de l\u2019information","volume":"3","year":"1954","journal-title":"Publ. Inst. Statist. Univ. 
Paris"},{"key":"ref_52","doi-asserted-by":"crossref","first-page":"379","DOI":"10.1002\/j.1538-7305.1948.tb01338.x","article-title":"A mathematical theory of communication","volume":"27","author":"Shannon","year":"1948","journal-title":"Bell System Tech. J."},{"key":"ref_53","doi-asserted-by":"crossref","first-page":"26","DOI":"10.1109\/TIT.1980.1056144","article-title":"Axiomatic derivation of the principle of maximum entropy and the principle of minimum cross-entropy","volume":"26","author":"Shore","year":"1980","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_54","doi-asserted-by":"crossref","first-page":"149","DOI":"10.1007\/BF00537520","article-title":"Information radius","volume":"14","author":"Sibson","year":"1969","journal-title":"Z. Wahrscheinlichkeitsth. Verw. Gebiete"},{"key":"ref_55","doi-asserted-by":"crossref","first-page":"479","DOI":"10.1007\/BF01016429","article-title":"Possible generalizations of the Boltzmann-Gibbs statistics","volume":"52","author":"Tsallis","year":"1988","journal-title":"J. Statist. Phys."},{"key":"ref_56","doi-asserted-by":"crossref","first-page":"297","DOI":"10.7146\/math.scand.a-10555","article-title":"A new derivation of the information function","volume":"6","author":"Tverberg","year":"1958","journal-title":"Math. Scand."},{"key":"ref_57","first-page":"9","article-title":"Bounds on the minimal error probability for testing a finite or countable number of hypotheses (in Russian)","volume":"4","author":"Vajda","year":"1968","journal-title":"Probl. Inform. Transmission"},{"key":"ref_58","unstructured":"Wald, A. (1947). Sequential Analysis, Wiley."},{"key":"ref_59","doi-asserted-by":"crossref","unstructured":"Yeung, R.W. (2002). 
A First Course in Information Theory, Kluwer.","DOI":"10.1007\/978-1-4419-8608-5"},{"key":"ref_60","doi-asserted-by":"crossref","first-page":"1440","DOI":"10.1109\/18.681320","article-title":"On characterizations of entropy function via information inequalities","volume":"44","author":"Zhang","year":"1998","journal-title":"IEEE Trans. Inf. Theory"}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/10\/3\/261\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T22:20:58Z","timestamp":1760221258000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/10\/3\/261"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2008,9,19]]},"references-count":60,"journal-issue":{"issue":"3","published-online":{"date-parts":[[2008,9]]}},"alternative-id":["e10030261"],"URL":"https:\/\/doi.org\/10.3390\/e10030261","relation":{},"ISSN":["1099-4300"],"issn-type":[{"value":"1099-4300","type":"electronic"}],"subject":[],"published":{"date-parts":[[2008,9,19]]}}}