{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T00:35:26Z","timestamp":1760142926011,"version":"build-2065373602"},"reference-count":22,"publisher":"MDPI AG","issue":"1","license":[{"start":{"date-parts":[[2024,1,19]],"date-time":"2024-01-19T00:00:00Z","timestamp":1705622400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>A new information theoretic condition is presented for reconstructing a discrete random variable X based on the knowledge of a set of discrete functions of X. The reconstruction condition is derived from Shannon\u2019s 1953 lattice theory with two entropic metrics of Shannon and Rajski. Because this theoretical material is relatively unknown and appears dispersed across different references, we first provide a synthetic description (with complete proofs) of its concepts, such as total, common, and complementary information. The definitions and properties of the two entropic metrics are also fully detailed and shown to be compatible with the lattice structure. A new geometric interpretation of this lattice structure is then investigated, which leads to a necessary (and sometimes sufficient) condition for reconstructing the discrete random variable X given a set {X1,\u2026,Xn} of elements in the lattice generated by X. Intuitively, the components X1,\u2026,Xn of the original source of information X should not be globally \u201ctoo far away\u201d from X in the entropic distance for X to be reconstructable. In other words, these components should not overall have too low a dependence on X; otherwise, reconstruction is impossible. These geometric considerations constitute a starting point for a possible novel \u201cperfect reconstruction theory\u201d, which needs to be further investigated and improved along these lines. Finally, this condition is illustrated in five specific examples of perfect reconstruction problems: the reconstruction of a symmetric random variable from the knowledge of its sign and absolute value, the reconstruction of a word from a set of linear combinations, the reconstruction of an integer from its prime signature (fundamental theorem of arithmetic) and from its remainders modulo a set of coprime integers (Chinese remainder theorem), and the reconstruction of the sorting permutation of a list from a minimal set of pairwise comparisons.<\/jats:p>","DOI":"10.3390\/e26010086","type":"journal-article","created":{"date-parts":[[2024,1,19]],"date-time":"2024-01-19T05:46:44Z","timestamp":1705643204000},"page":"86","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":1,"title":["An Information Theoretic Condition for Perfect Reconstruction"],"prefix":"10.3390","volume":"26","author":[{"given":"Idris","family":"Delsol","sequence":"first","affiliation":[{"name":"Laboratoire de Traitement et Communication de l\u2019Information, T\u00e9l\u00e9com Paris, Institut Polytechnique de Paris, 91120 Palaiseau, France"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-8681-8916","authenticated-orcid":false,"given":"Olivier","family":"Rioul","sequence":"additional","affiliation":[{"name":"Laboratoire de Traitement et Communication de l\u2019Information, T\u00e9l\u00e9com Paris, Institut Polytechnique de Paris, 91120 Palaiseau, France"}]},{"ORCID":"https:\/\/orcid.org\/0009-0000-0729-6965","authenticated-orcid":false,"given":"Julien","family":"B\u00e9guinot","sequence":"additional","affiliation":[{"name":"Laboratoire de Traitement et Communication de l\u2019Information, T\u00e9l\u00e9com Paris, Institut Polytechnique de Paris, 91120 Palaiseau, France"}]},{"given":"Victor","family":"Rabiet","sequence":"additional","affiliation":[{"name":"Laboratoire de Traitement et Communication de l\u2019Information, T\u00e9l\u00e9com Paris, Institut Polytechnique de Paris, 91120 Palaiseau, France"},{"name":"D\u00e9partement de Math\u00e9matiques et Applications, \u00c9cole Normale Sup\u00e9rieure, 75005 Paris, France"}]},{"given":"Antoine","family":"Souloumiac","sequence":"additional","affiliation":[{"name":"CEA-List, Universit\u00e9 Paris-Saclay, 91120 Palaiseau, France"}]}],"member":"1968","published-online":{"date-parts":[[2024,1,19]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"379","DOI":"10.1002\/j.1538-7305.1948.tb01338.x","article-title":"A mathematical theory of communication","volume":"27","author":"Shannon","year":"1948","journal-title":"Bell Syst. Tech. J."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"105","DOI":"10.1109\/TIT.1953.1188572","article-title":"The lattice theory of information","volume":"1","author":"Shannon","year":"1953","journal-title":"Trans. IRE Prof. Group Inf. Theory"},{"key":"ref_3","unstructured":"Fano, R.M. (2001). Interview by Aftab, Cheung, Kim, Thakkar, Yeddanapudi, 6.933 Project History, Massachusetts Institute of Technology."},{"key":"ref_4","unstructured":"Fano, R.M. (1952). Class Notes for Course 6.574: Transmission of Information, MIT."},{"key":"ref_5","first-page":"383","article-title":"A history of the theory of information","volume":"98","author":"Cherry","year":"1951","journal-title":"Proc. Inst. Electr. Eng."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"3","DOI":"10.1109\/TIT.1956.1056774","article-title":"The bandwagon (editorial)","volume":"2","author":"Shannon","year":"1956","journal-title":"IRE Trans. Inf. Theory"},{"key":"ref_7","unstructured":"Shannon, C.E. (1950). Some Topics on Information Theory. Proceedings of the International Congress of Mathematicians, Cambridge, MA, USA, Volume II."},{"key":"ref_8","unstructured":"Rioul, O., B\u00e9guinot, J., Rabiet, V., and Souloumiac, A. (2022, September 6\u20139). La v\u00e9ritable (et m\u00e9connue) th\u00e9orie de l\u2019information de Shannon [Shannon\u2019s true (and little-known) information theory]. Proceedings of the 28e Colloque GRETSI 2022, Nancy, France."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"371","DOI":"10.1016\/S0019-9958(61)80055-7","article-title":"A metric space of discrete probability distributions","volume":"4","author":"Rajski","year":"1961","journal-title":"Inf. Control"},{"key":"ref_10","first-page":"149","article-title":"Common information is far less than mutual information","volume":"2","year":"1973","journal-title":"Probl. Control Inf. Theory"},{"key":"ref_11","unstructured":"El Gamal, A., and Kim, Y.-H. (2011). Network Information Theory, Cambridge University Press."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"163","DOI":"10.1109\/TIT.1975.1055346","article-title":"The common information of two dependent random variables","volume":"21","author":"Wyner","year":"1975","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"443","DOI":"10.2996\/kmj\/1138846220","article-title":"Entropy and semivaluations on semilattices","volume":"22","author":"Nakamura","year":"1970","journal-title":"Kodai Math. Semin. Rep."},{"key":"ref_14","unstructured":"Yeung, R.W. (2008). Information Theory and Network Coding, Springer."},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"403","DOI":"10.1016\/S0019-9958(73)90554-8","article-title":"A note on entropy metrics","volume":"22","author":"Horibe","year":"1973","journal-title":"Inf. Control"},{"key":"ref_16","first-page":"241","article-title":"Distribution de la flore alpine dans le bassin des Dranses et dans quelques r\u00e9gions voisines [Distribution of the alpine flora in the Dranses basin and some neighboring regions]","volume":"37","author":"Jaccard","year":"1901","journal-title":"Bull. Soci\u00e9t\u00e9 Vaudoise des Sci. Nat."},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Csisz\u00e1r, I., and K\u00f6rner, J. (2011). Information Theory: Coding Theorems for Discrete Memoryless Systems, Cambridge University Press. [2nd ed.].","DOI":"10.1017\/CBO9780511921889"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"576","DOI":"10.3758\/BF03207491","article-title":"Information measurement of distinctiveness and similarity","volume":"44","author":"Donderi","year":"1988","journal-title":"Percept. Psychophys."},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"823","DOI":"10.1068\/p5249","article-title":"An information theory analysis of visual complexity and dissimilarity","volume":"35","author":"Donderi","year":"2006","journal-title":"Perception"},{"key":"ref_20","unstructured":"Rioul, O. (2007). Th\u00e9orie de l\u2019information et du Codage [Information and Coding Theory], Hermes Science\u2014Lavoisier."},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"3","DOI":"10.1109\/TIT.1973.1054955","article-title":"The early days of information theory","volume":"19","author":"Pierce","year":"1973","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"404","DOI":"10.1017\/S0960129513000649","article-title":"Algebraic foundations for quantitative information flow","volume":"25","author":"Malacaria","year":"2015","journal-title":"Math. Struct. Comput. Sci."}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/26\/1\/86\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T13:45:56Z","timestamp":1760103956000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/26\/1\/86"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,1,19]]},"references-count":22,"journal-issue":{"issue":"1","published-online":{"date-parts":[[2024,1]]}},"alternative-id":["e26010086"],"URL":"https:\/\/doi.org\/10.3390\/e26010086","relation":{},"ISSN":["1099-4300"],"issn-type":[{"type":"electronic","value":"1099-4300"}],"subject":[],"published":{"date-parts":[[2024,1,19]]}}}