{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,27]],"date-time":"2026-01-27T11:49:24Z","timestamp":1769514564926,"version":"3.49.0"},"reference-count":10,"publisher":"MDPI AG","issue":"11","license":[{"start":{"date-parts":[[2011,11,24]],"date-time":"2011-11-24T00:00:00Z","timestamp":1322092800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/3.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>There are numerous characterizations of Shannon entropy and Tsallis entropy as measures of information obeying certain properties. Using work by Faddeev and Furuichi, we derive a very simple characterization. Instead of focusing on the entropy of a probability measure on a finite set, this characterization focuses on the \u201cinformation loss\u201d, or change in entropy, associated with a measure-preserving function. Information loss is a special case of conditional entropy: namely, it is the entropy of a random variable conditioned on some function of that variable. We show that Shannon entropy gives the only concept of information loss that is functorial, convex-linear and continuous. This characterization naturally generalizes to Tsallis entropy as well.<\/jats:p>","DOI":"10.3390\/e13111945","type":"journal-article","created":{"date-parts":[[2011,11,24]],"date-time":"2011-11-24T10:28:35Z","timestamp":1322130515000},"page":"1945-1957","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":81,"title":["A Characterization of Entropy in Terms of Information Loss"],"prefix":"10.3390","volume":"13","author":[{"given":"John C.","family":"Baez","sequence":"first","affiliation":[{"name":"Department of Mathematics, University of California, Riverside, CA 92521, USA"},{"name":"Centre for Quantum Technologies, National University of Singapore, 117543, Singapore"}]},{"given":"Tobias","family":"Fritz","sequence":"additional","affiliation":[{"name":"Institut de Ci\u00e8ncies Fot\u00f2niques, Mediterranean Technology Park, 08860 Castelldefels (Barcelona), Spain"}]},{"given":"Tom","family":"Leinster","sequence":"additional","affiliation":[{"name":"School of Mathematics and Statistics, University of Glasgow, Glasgow G12 8QW, UK"}]}],"member":"1968","published-online":{"date-parts":[[2011,11,24]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"379","DOI":"10.1002\/j.1538-7305.1948.tb01338.x","article-title":"A mathematical theory of communication","volume":"27","author":"Shannon","year":"1948","journal-title":"Bell Syst. Tech. J."},{"key":"ref_2","unstructured":"Acz\u00e9l, J., and Dar\u00f3czy, Z. (1975). Mathematics in Science and Engineering, Academic Press."},{"key":"ref_3","unstructured":"Mac Lane, S. (1971). Graduate Texts in Mathematics 5, Springer."},{"key":"ref_4","first-page":"227","article-title":"On the concept of entropy of a finite probabilistic scheme","volume":"1","author":"Faddeev","year":"1956","journal-title":"Uspehi Mat. Nauk (N.S.)"},{"key":"ref_5","unstructured":"R\u00e9nyi, A. (30,, January June). Measures of information and entropy. Proceedings of the 4th Berkeley Symposium on Mathematical Statistics and Probability, Statistical Laboratory of the University of California, Berkeley, CA, USA."},{"key":"ref_6","unstructured":"Cover, T.M., and Thomas, J.A. (2006). Elements of Information Theory,  Wiley-Interscience. 
Wiley Series in Telecommunications and Signal Processing."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"479","DOI":"10.1007\/BF01016429","article-title":"Possible generalization of Boltzmann-Gibbs statistics","volume":"52","author":"Tsallis","year":"1988","journal-title":"J. Stat. Phys."},{"key":"ref_8","first-page":"30","article-title":"Quantification method of classification processes: concept of structural \u03b1-entropy","volume":"3","author":"Havrda","year":"1967","journal-title":"Kybernetika"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"548","DOI":"10.1080\/01621459.1982.10477845","article-title":"Diversity as a concept and its measurement","volume":"77","author":"Patil","year":"1982","journal-title":"J. Am. Stat. Assoc."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"3638","DOI":"10.1109\/TIT.2005.855606","article-title":"On uniqueness theorems for Tsallis entropy and Tsallis relative entropy","volume":"51","author":"Furuichi","year":"2005","journal-title":"IEEE Trans. Infor. Theor."}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/13\/11\/1945\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T21:58:07Z","timestamp":1760219887000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/13\/11\/1945"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2011,11,24]]},"references-count":10,"journal-issue":{"issue":"11","published-online":{"date-parts":[[2011,11]]}},"alternative-id":["e13111945"],"URL":"https:\/\/doi.org\/10.3390\/e13111945","relation":{},"ISSN":["1099-4300"],"issn-type":[{"value":"1099-4300","type":"electronic"}],"subject":[],"published":{"date-parts":[[2011,11,24]]}}}
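A hedged sketch of the quantity the abstract describes, using standard notation that the record itself does not spell out (H for Shannon entropy, f_* p for the pushforward distribution, c for a scale constant): for a measure-preserving function f between finite probability spaces, "information loss" is the drop in entropy, and the characterization says that any functorial, convex-linear, continuous assignment F must be a nonnegative multiple of that drop.

\[
H(p) = -\sum_{i} p_i \log p_i, \qquad
F(f) = H(p) - H(q) \quad \text{for } f\colon (X,p) \to (Y,q),\ q = f_{*}p,
\]
\[
F \ \text{functorial, convex-linear, continuous} \;\Longrightarrow\; F(f) = c\,\bigl(H(p) - H(q)\bigr) \ \text{for some } c \ge 0.
\]

Per the abstract's last sentence, the same characterization is stated to generalize when H is replaced by the Tsallis entropy \(H_{\alpha}(p) = \tfrac{1}{\alpha-1}\bigl(1 - \sum_i p_i^{\alpha}\bigr)\).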