{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,24]],"date-time":"2026-03-24T13:03:57Z","timestamp":1774357437205,"version":"3.50.1"},"reference-count":16,"publisher":"MDPI AG","issue":"3","license":[{"start":{"date-parts":[[2011,3,3]],"date-time":"2011-03-03T00:00:00Z","timestamp":1299110400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/3.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>Kolmogorov complexity and Shannon entropy are conceptually different measures. However, for any recursive probability distribution, the expected value of Kolmogorov complexity equals its Shannon entropy, up to a constant. We study whether a similar relationship holds for R\u00e9nyi and Tsallis entropies of order \u03b1, showing that it holds only for \u03b1 = 1. Regarding a time-bounded analogue of this relationship, we show that a similar result holds for some distributions. We prove that, for the universal time-bounded distribution mt(x), the Tsallis and R\u00e9nyi entropies converge if and only if \u03b1 is greater than 1. We also establish the uniform continuity of these entropies.<\/jats:p>","DOI":"10.3390\/e13030595","type":"journal-article","created":{"date-parts":[[2011,3,5]],"date-time":"2011-03-05T12:18:45Z","timestamp":1299327525000},"page":"595-611","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":42,"title":["Entropy Measures vs. Kolmogorov Complexity"],"prefix":"10.3390","volume":"13","author":[{"given":"Andreia","family":"Teixeira","sequence":"first","affiliation":[{"name":"Computer Science Department, Faculty of Sciences, University of Porto, Rua Campo Alegre 1021\/1055, 4169-007 Porto, Portugal"},{"name":"Instituto de Telecomunica\u00e7\u00f5es, Av. Rovisco Pais 1, 1049-001 Lisboa, Portugal"}]},{"given":"Armando","family":"Matos","sequence":"additional","affiliation":[{"name":"Computer Science Department, Faculty of Sciences, University of Porto, Rua Campo Alegre 1021\/1055, 4169-007 Porto, Portugal"},{"name":"Laborat\u00f3rio de Intelig\u00eancia Artificial e Ci\u00eancia de Computadores, Rua Campo Alegre 1021\/1055, 4169-007 Porto, Portugal"}]},{"given":"Andr\u00e9","family":"Souto","sequence":"additional","affiliation":[{"name":"Computer Science Department, Faculty of Sciences, University of Porto, Rua Campo Alegre 1021\/1055, 4169-007 Porto, Portugal"},{"name":"Instituto de Telecomunica\u00e7\u00f5es, Av. Rovisco Pais 1, 1049-001 Lisboa, Portugal"}]},{"given":"Lu\u00eds","family":"Antunes","sequence":"additional","affiliation":[{"name":"Computer Science Department, Faculty of Sciences, University of Porto, Rua Campo Alegre 1021\/1055, 4169-007 Porto, Portugal"},{"name":"Instituto de Telecomunica\u00e7\u00f5es, Av. Rovisco Pais 1, 1049-001 Lisboa, Portugal"}]}],"member":"1968","published-online":{"date-parts":[[2011,3,3]]},"reference":[{"key":"ref_1","unstructured":"Li, M., and Vit\u00e1nyi, P. (2008). An Introduction to Kolmogorov Complexity and Its Applications, Springer Publishing Company, Incorporated. [3rd ed.]."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"379","DOI":"10.1002\/j.1538-7305.1948.tb01338.x","article-title":"A mathematical theory of communication","volume":"27","author":"Shannon","year":"1948","journal-title":"Bell Syst. Tech. J."},{"key":"ref_3","unstructured":"R\u00e9nyi, A. (July, January 20). On measures of entropy and information. Proceedings of the 4th Berkeley Symposium on Mathematics, Statistics and Probability, Statistical Laboratory of the University of California, Berkeley, CA, USA."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"479","DOI":"10.1007\/BF01016429","article-title":"Possible generalization of Boltzmann-Gibbs statistics","volume":"52","author":"Tsallis","year":"1988","journal-title":"J. Stat. Phys."},{"key":"ref_5","unstructured":"Cachin, C. (1997). ETH Series in Information Security and Cryptography, Hartung-Gorre Verlag. [1st ed.]."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"840","DOI":"10.1214\/aop\/1176991250","article-title":"Kolmogorov\u2019s contributions to information theory and algorithmic complexity","volume":"17","author":"Cover","year":"1989","journal-title":"Ann. Probab."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1016\/S0019-9958(64)90223-2","article-title":"A formal theory of inductive inference; Part I","volume":"7","author":"Solomonoff","year":"1964","journal-title":"Inform. Contr."},{"key":"ref_8","first-page":"1","article-title":"Three approaches to the quantitative definition of information","volume":"1","author":"Kolmogorov","year":"1965","journal-title":"Probl. Inform. Transm."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"547","DOI":"10.1145\/321356.321363","article-title":"On the length of programs for computing finite binary sequences","volume":"13","author":"Chaitin","year":"1966","journal-title":"J. ACM"},{"key":"ref_10","unstructured":"Solovay, R.M. (1975). Draft of a paper (or series of papers) on Chaitin\u2019s work, IBM Thomas J. Watson Research Center, Yorktown Heights, New York, NY, USA. Unpublished manuscript."},{"key":"ref_11","first-page":"206","article-title":"Laws of information conservation (non-growth) and aspects of the foundation of probability theory","volume":"10","author":"Levin","year":"1974","journal-title":"Prob. Peredachi Inf."},{"key":"ref_12","first-page":"1477","article-title":"On the symmetry of algorithmic information","volume":"15","author":"G\u00e1cs","year":"1974","journal-title":"Soviet Mathematics Doklady"},{"key":"ref_13","unstructured":"Gr\u00fcnwald, P., and Vit\u00e1nyi, P. (2004). Shannon information and Kolmogorov complexity. arXiv, arXiv:cs\/0410002v1."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"119","DOI":"10.1016\/0375-9601(95)00035-2","article-title":"Thermodynamic stability conditions for the Tsallis and R\u00e9nyi entropies","volume":"198","author":"Ramshaw","year":"1995","journal-title":"Phys. Lett. A"},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Tadaki, K. (2008, January 6\u201311). The Tsallis entropy and the Shannon entropy of a universal probability. Proceedings of the International Symposium on Information Theory, Toronto, Canada. abs\/0805.0154.","DOI":"10.1109\/ISIT.2008.4595362"},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"171","DOI":"10.1007\/s11005-007-0155-1","article-title":"Uniform estimates on the Tsallis entropies","volume":"80","author":"Zhang","year":"2007","journal-title":"Lett. Math. Phys."}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/13\/3\/595\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T21:55:25Z","timestamp":1760219725000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/13\/3\/595"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2011,3,3]]},"references-count":16,"journal-issue":{"issue":"3","published-online":{"date-parts":[[2011,3]]}},"alternative-id":["e13030595"],"URL":"https:\/\/doi.org\/10.3390\/e13030595","relation":{},"ISSN":["1099-4300"],"issn-type":[{"value":"1099-4300","type":"electronic"}],"subject":[],"published":{"date-parts":[[2011,3,3]]}}}