{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,29]],"date-time":"2026-01-29T11:45:56Z","timestamp":1769687156333,"version":"3.49.0"},"reference-count":53,"publisher":"MDPI AG","issue":"2","license":[{"start":{"date-parts":[[2016,1,26]],"date-time":"2016-01-26T00:00:00Z","timestamp":1453766400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>The interactions between three or more random variables are often nontrivial, poorly understood and, yet, are paramount for future advances in fields such as network information theory, neuroscience and genetics. In this work, we analyze these interactions as different modes of information sharing. Towards this end, and in contrast to most of the literature that focuses on analyzing the mutual information, we introduce an axiomatic framework for decomposing the joint entropy that characterizes the various ways in which random variables can share information. Our framework distinguishes between interdependencies where the information is shared redundantly and synergistic interdependencies where the sharing structure exists in the whole, but not between the parts. The key contribution of our approach is to focus on symmetric properties of this sharing, which do not depend on a specific point of view for differentiating roles between its components. We show that our axioms determine unique formulas for all of the terms of the proposed decomposition for systems of three variables in several cases of interest. 
Moreover, we show how these results can be applied to several network information theory problems, providing a more intuitive understanding of their fundamental limits.<\/jats:p>","DOI":"10.3390\/e18020038","type":"journal-article","created":{"date-parts":[[2016,1,26]],"date-time":"2016-01-26T10:00:42Z","timestamp":1453802442000},"page":"38","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":41,"title":["Understanding Interdependency Through Complex Information Sharing"],"prefix":"10.3390","volume":"18","author":[{"given":"Fernando","family":"Rosas","sequence":"first","affiliation":[{"name":"Departement Elektrotechniek, KU Leuven, Leuven 3001, Belgium"}]},{"given":"Vasilis","family":"Ntranos","sequence":"additional","affiliation":[{"name":"Department of Electrical Engineering &amp; Computer Sciences, UC Berkeley, Berkeley, CA 94720, USA"}]},{"given":"Christopher","family":"Ellison","sequence":"additional","affiliation":[{"name":"Center for Complexity and Collective Computation, University of Wisconsin-Madison, Madison, WI 53706, USA"}]},{"given":"Sofie","family":"Pollin","sequence":"additional","affiliation":[{"name":"Departement Elektrotechniek, KU Leuven, Leuven 3001, Belgium"}]},{"given":"Marian","family":"Verhelst","sequence":"additional","affiliation":[{"name":"Departement Elektrotechniek, KU Leuven, Leuven 3001, Belgium"}]}],"member":"1968","published-online":{"date-parts":[[2016,1,26]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","unstructured":"Kaneko, K. (2006). Life: An Introduction to Complex Systems Biology, Springer-Verlag.","DOI":"10.1007\/978-3-540-32667-0"},{"key":"ref_2","unstructured":"Perrings, C. (2005). 
Economy and Environment: A Theoretical Essay on the Interdependence of Economic and Environmental Systems, Cambridge University Press."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"2621","DOI":"10.1162\/089976600300014872","article-title":"Neural coding: Higher-order temporal patterns in the neurostatistics of cell assemblies","volume":"12","author":"Martignon","year":"2000","journal-title":"Neural Comput."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"50","DOI":"10.1186\/1752-0509-2-50","article-title":"Can single knockouts accurately single out gene functions?","volume":"2","author":"Deutscher","year":"2008","journal-title":"BMC Syst. Biol."},{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"Anand, K., and Bianconi, G. (2009). Entropy measures for networks: Toward an information theory of complex topologies. Phys. Rev. E, 80.","DOI":"10.1103\/PhysRevE.80.045102"},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"70","DOI":"10.1109\/MSP.2006.1657818","article-title":"Sensing reality and communicating bits: A dangerous liaison","volume":"23","author":"Gastpar","year":"2006","journal-title":"IEEE Signal Process. Mag."},{"key":"ref_7","unstructured":"Casella, G., and Berger, R.L. (2002). Statistical Inference, Duxbury Press."},{"key":"ref_8","unstructured":"Cover, T.M., and Thomas, J.A. (1991). Elements of Information Theory, John Wiley."},{"key":"ref_9","unstructured":"Senge, P.M., Smith, B., Kruschwitz, N., Laur, J., and Schley, S. (2008). The Necessary Revolution: How Individuals and Organizations Are Working Together to Create a Sustainable World, Crown Business."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"1701","DOI":"10.1109\/18.930911","article-title":"Information geometry on hierarchy of probability distributions","volume":"47","author":"Amari","year":"2001","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_11","unstructured":"Williams, P.L., and Beer, R.D. (2010). 
Nonnegative decomposition of multivariate information."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"3501","DOI":"10.3390\/e17053501","article-title":"Information Decomposition and Synergy","volume":"17","author":"Olbrich","year":"2015","journal-title":"Entropy"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"823","DOI":"10.1007\/BF01025996","article-title":"Mutual information functions versus correlation functions","volume":"60","author":"Li","year":"1990","journal-title":"J. Stat. Phys."},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Jaynes, E.T. (2003). Probability Theory: The Logic of Science, Cambridge University Press.","DOI":"10.1017\/CBO9780511790423"},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"1152","DOI":"10.1063\/1.1721463","article-title":"The negentropy principle of information","volume":"24","author":"Brillouin","year":"1953","journal-title":"J. Appl. Phys."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"66","DOI":"10.1147\/rd.41.0066","article-title":"Information theoretical analysis of multivariate correlation","volume":"4","author":"Watanabe","year":"1960","journal-title":"IBM J. Res. Dev."},{"key":"ref_17","unstructured":"Studen\u00fd, M., and Vejnarov\u00e1, J. (1998). Learning in Graphical Models, Springer Netherlands."},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Schneidman, E., Still, S., Berry, M.J., and Bialek, W. (2003). Network information and connected correlations. Phys. Rev. Lett., 91.","DOI":"10.1103\/PhysRevLett.91.238701"},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"1007","DOI":"10.1038\/nature04701","article-title":"Weak pairwise correlations imply strongly correlated network states in a neural population","volume":"440","author":"Schneidman","year":"2006","journal-title":"Nature"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Roudi, Y., Nirenberg, S., and Latham, P.E. (2009). 
Pairwise Maximum Entropy Models for Studying Large Biological Systems: When They Can Work and When They Can\u2019t. PLoS Comput. Biol., 5.","DOI":"10.1371\/journal.pcbi.1000380"},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"4786","DOI":"10.1073\/pnas.1118633109","article-title":"Statistical mechanics for natural flocks of birds","volume":"109","author":"Bialek","year":"2012","journal-title":"Proc. Natl. Acad. Sci. USA"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Merchan, L., and Nemenman, I. (2015). On the sufficiency of pairwise interactions in maximum entropy models of biological networks.","DOI":"10.1007\/s10955-016-1456-5"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"14259","DOI":"10.1073\/pnas.1203021109","article-title":"Sparse code of conflict in a primate society","volume":"109","author":"Daniels","year":"2012","journal-title":"Proc. Natl. Acad. Sci. USA"},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"275","DOI":"10.1007\/s10955-015-1253-6","article-title":"Statistical mechanics of the US Supreme Court","volume":"160","author":"Lee","year":"2013","journal-title":"J. Stat. Phys."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"133","DOI":"10.1016\/S0019-9958(78)90275-9","article-title":"Nonnegative entropy measures of multivariate symmetric correlations","volume":"36","author":"Sun","year":"1978","journal-title":"Inf. Control"},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"407","DOI":"10.1140\/epjb\/e2008-00134-9","article-title":"How should complexity scale with system size?","volume":"63","author":"Olbrich","year":"2008","journal-title":"Eur. Phys. J. 
B"},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"25","DOI":"10.1063\/1.1530990","article-title":"Regularities unseen, randomness observed: Levels of entropy convergence","volume":"13","author":"Crutchfield","year":"2003","journal-title":"Chaos"},{"key":"ref_28","unstructured":"Rosas, F., Ntranos, V., Ellison, C.J., Verhelst, M., and Pollin, S. (2015, January 6\u20137). Understanding high-order correlations using a synergy-based decomposition of the total entropy. Proceedings of the 5th Joint WIC\/IEEE Symposium on Information Theory and Signal Processing in the Benelux, Brussels, Belgium."},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"466","DOI":"10.1109\/18.79902","article-title":"A new outlook on Shannon\u2019s information measures","volume":"37","author":"Yeung","year":"1991","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"47","DOI":"10.1142\/S0219525904000068","article-title":"Multiscale complexity\/entropy","volume":"7","year":"2004","journal-title":"Adv. Complex Syst."},{"key":"ref_31","unstructured":"Bell, A.J. (2003, January 1\u20134). The co-information lattice. Proceedings of the 4th International Symposium on Independent Component Analysis and Blind Signal Separation (ICA 2003), Nara, Japan."},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"97","DOI":"10.1007\/BF02289159","article-title":"Multivariate information transmission","volume":"19","author":"McGill","year":"1954","journal-title":"Psychometrika"},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"James, R.G., Ellison, C.J., and Crutchfield, J.P. (2011). Anatomy of a bit: Information in a time series observation. Chaos Interdiscip. 
J Nonlinear Sci., 21.","DOI":"10.1063\/1.3637494"},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"159","DOI":"10.1007\/978-3-642-53734-9_6","article-title":"Quantifying Synergistic Mutual Information","volume":"Volume 9","author":"Prokopenko","year":"2014","journal-title":"Guided Self-Organization: Inception"},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Griffith, V. (2014). Quantifying synergistic information. [Ph.D. Thesis, California Institute of Technology].","DOI":"10.1007\/978-3-642-53734-9_6"},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"5357","DOI":"10.1109\/TIT.2015.2462848","article-title":"Justification of Logarithmic Loss via the Benefit of Side Information","volume":"61","author":"Jiao","year":"2015","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_37","doi-asserted-by":"crossref","unstructured":"Harder, M., Salge, C., and Polani, D. (2013). Bivariate measure of redundant information. Phys. Rev. E, 87.","DOI":"10.1103\/PhysRevE.87.012130"},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"1985","DOI":"10.3390\/e16041985","article-title":"Intersection information based on common randomness","volume":"16","author":"Griffith","year":"2014","journal-title":"Entropy"},{"key":"ref_39","doi-asserted-by":"crossref","first-page":"2161","DOI":"10.3390\/e16042161","article-title":"Quantifying unique information","volume":"16","author":"Bertschinger","year":"2014","journal-title":"Entropy"},{"key":"ref_40","doi-asserted-by":"crossref","unstructured":"Bertschinger, N., Rauh, J., Olbrich, E., and Jost, J. (2012, January 3\u20137). Shared information\u2014New insights and problems in decomposing information in complex systems. Proceedings of the European Conference on Complex Systems 2012, Brussels, Belgium.","DOI":"10.1007\/978-3-319-00395-5_35"},{"key":"ref_41","doi-asserted-by":"crossref","unstructured":"Barrett, A.B. (2015). 
Exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems. Phys. Rev. E, 91.","DOI":"10.1103\/PhysRevE.91.052802"},{"key":"ref_42","doi-asserted-by":"crossref","first-page":"47","DOI":"10.2307\/3002000","article-title":"Limitations of the application of fourfold table analysis to hospital data","volume":"2","author":"Berkson","year":"1946","journal-title":"Biom. Bull."},{"key":"ref_43","unstructured":"Kim, J., and Pearl, J. (1983, January 8\u201312). A computational model for causal and diagnostic reasoning in inference systems. Proceedings of the Eighth International Joint Conference on Artificial Intelligence (IJCAI), Karlsruhe, Germany."},{"key":"ref_44","doi-asserted-by":"crossref","unstructured":"Bloch, M., and Barros, J. (2011). Physical-Layer Security: From Information Theory to Security Engineering, Cambridge University Press.","DOI":"10.1017\/CBO9780511977985"},{"key":"ref_45","doi-asserted-by":"crossref","first-page":"656","DOI":"10.1002\/j.1538-7305.1949.tb00928.x","article-title":"Communication theory of secrecy systems*","volume":"28","author":"Shannon","year":"1949","journal-title":"Bell Syst. Tech. J."},{"key":"ref_46","doi-asserted-by":"crossref","first-page":"1204","DOI":"10.1109\/18.850663","article-title":"Network information flow","volume":"46","author":"Ahlswede","year":"2000","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_47","doi-asserted-by":"crossref","first-page":"371","DOI":"10.1109\/TIT.2002.807285","article-title":"Linear network coding","volume":"49","author":"Li","year":"2003","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_48","doi-asserted-by":"crossref","first-page":"497","DOI":"10.1109\/TNET.2008.923722","article-title":"XORs in the air: Practical wireless network coding","volume":"16","author":"Katti","year":"2008","journal-title":"IEEE\/ACM Trans. Netw."},{"key":"ref_49","unstructured":"Landau, L., and Lifshitz, E. (1970). Statistical Physics, Pergamon Press. 
[2nd ed.]."},{"key":"ref_50","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1561\/2200000001","article-title":"Graphical models, exponential families, and variational inference","volume":"1","author":"Wainwright","year":"2008","journal-title":"Found. Trends Mach. Learn."},{"key":"ref_51","doi-asserted-by":"crossref","first-page":"937","DOI":"10.1080\/00029890.1987.12000742","article-title":"An introduction to the Ising model","volume":"94","author":"Cipra","year":"1987","journal-title":"Am. Math. Mon."},{"key":"ref_52","unstructured":"Sayed, A.H. (2011). Adaptive Filters, John Wiley & Sons."},{"key":"ref_53","doi-asserted-by":"crossref","unstructured":"El Gamal, A., and Kim, Y.H. (2011). Network Information Theory, Cambridge University Press.","DOI":"10.1017\/CBO9781139030687"}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/18\/2\/38\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T19:18:20Z","timestamp":1760210300000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/18\/2\/38"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2016,1,26]]},"references-count":53,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2016,2]]}},"alternative-id":["e18020038"],"URL":"https:\/\/doi.org\/10.3390\/e18020038","relation":{},"ISSN":["1099-4300"],"issn-type":[{"value":"1099-4300","type":"electronic"}],"subject":[],"published":{"date-parts":[[2016,1,26]]}}}