{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T02:16:31Z","timestamp":1760148991298,"version":"build-2065373602"},"reference-count":43,"publisher":"MDPI AG","issue":"7","license":[{"start":{"date-parts":[[2023,6,25]],"date-time":"2023-06-25T00:00:00Z","timestamp":1687651200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100001871","name":"Funda\u00e7\u00e3o para a Ci\u00eancia e Tecnologia","doi-asserted-by":"publisher","award":["SFRH\/BD\/145472\/2019","UIDB\/50008\/2020","C645008882-00000055"],"award-info":[{"award-number":["SFRH\/BD\/145472\/2019","UIDB\/50008\/2020","C645008882-00000055"]}],"id":[{"id":"10.13039\/501100001871","id-type":"DOI","asserted-by":"publisher"}]},{"name":"Instituto de Telecomunica\u00e7\u00f5es; Portuguese Recovery and Resilience Plan","award":["SFRH\/BD\/145472\/2019","UIDB\/50008\/2020","C645008882-00000055"],"award-info":[{"award-number":["SFRH\/BD\/145472\/2019","UIDB\/50008\/2020","C645008882-00000055"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>The partial information decomposition (PID) framework is concerned with decomposing the information that a set of random variables has with respect to a target variable into three types of components: redundant, synergistic, and unique. Classical information theory alone does not provide a unique way to decompose information in this manner, and additional assumptions have to be made. Recently, Kolchinsky proposed a new general axiomatic approach to obtain measures of redundant information based on choosing an order relation between information sources (equivalently, order between communication channels). 
In this paper, we exploit this approach to introduce three new measures of redundant information (and the resulting decompositions) based on well-known preorders between channels, contributing to the enrichment of the PID landscape. We relate the new decompositions to existing ones, study several of their properties, and provide examples illustrating their novelty. As a side result, we prove that any preorder that satisfies Kolchinsky\u2019s axioms yields a decomposition that meets the axioms originally introduced by Williams and Beer when they first proposed PID.<\/jats:p>","DOI":"10.3390\/e25070975","type":"journal-article","created":{"date-parts":[[2023,6,26]],"date-time":"2023-06-26T05:39:13Z","timestamp":1687757953000},"page":"975","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":1,"title":["Orders between Channels and Implications for Partial Information Decomposition"],"prefix":"10.3390","volume":"25","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-6723-9468","authenticated-orcid":false,"given":"Andr\u00e9 F. C.","family":"Gomes","sequence":"first","affiliation":[{"name":"Instituto de Telecomunica\u00e7\u00f5es and LUMLIS (Lisbon ELLIS Unit), Instituto Superior T\u00e9cnico, Universidade de Lisboa, 1049-001 Lisboa, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-0970-7745","authenticated-orcid":false,"given":"M\u00e1rio A. T.","family":"Figueiredo","sequence":"additional","affiliation":[{"name":"Instituto de Telecomunica\u00e7\u00f5es and LUMLIS (Lisbon ELLIS Unit), Instituto Superior T\u00e9cnico, Universidade de Lisboa, 1049-001 Lisboa, Portugal"}]}],"member":"1968","published-online":{"date-parts":[[2023,6,25]]},"reference":[{"key":"ref_1","unstructured":"Williams, P., and Beer, R. (2010). Nonnegative decomposition of multivariate information. arXiv."},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Lizier, J., Flecker, B., and Williams, P. (2013, January 16\u201319). 
Towards a synergy-based approach to measuring information modification. Proceedings of the 2013 IEEE Symposium on Artificial Life (ALIFE), Singapore.","DOI":"10.1109\/ALIFE.2013.6602430"},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Wibral, M., Finn, C., Wollstadt, P., Lizier, J., and Priesemann, V. (2017). Quantifying information modification in developing neural networks via partial information decomposition. Entropy, 19.","DOI":"10.3390\/e19090494"},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Rauh, J. (2017). Secret sharing and shared information. Entropy, 19.","DOI":"10.3390\/e19110601"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"45","DOI":"10.1007\/s10827-010-0262-3","article-title":"Transfer entropy\u2014A model-free measure of effective connectivity for the neurosciences","volume":"30","author":"Vicente","year":"2011","journal-title":"J. Comput. Neurosci."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"17681","DOI":"10.1038\/srep17681","article-title":"Tracing the flow of perceptual features in an algorithmic brain network","volume":"5","author":"Ince","year":"2015","journal-title":"Sci. Rep."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"24456","DOI":"10.1038\/srep24456","article-title":"Control of complex networks requires both structure and dynamics","volume":"6","author":"Gates","year":"2016","journal-title":"Sci. Rep."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"384","DOI":"10.1162\/netn_a_00069","article-title":"Computation is concentrated in rich clubs of local cortical networks","volume":"3","author":"Faber","year":"2019","journal-title":"Netw. Neurosci."},{"key":"ref_9","unstructured":"James, R., Ayala, B., Zakirov, B., and Crutchfield, J. (2018). Modes of information flow. 
arXiv."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"42","DOI":"10.1111\/j.1467-9469.2011.00774.x","article-title":"Shannon Entropy and Mutual Information for Multivariate Skew-Elliptical Distributions","volume":"40","author":"Genton","year":"2013","journal-title":"Scand. J. Stat."},{"key":"ref_11","unstructured":"Cover, T. (1999). Elements of Information Theory, John Wiley & Sons."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"20210110","DOI":"10.1098\/rspa.2021.0110","article-title":"Bits and pieces: Understanding information decomposition from part-whole relationships and formal logic","volume":"477","author":"Gutknecht","year":"2021","journal-title":"Proc. R. Soc. A"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"012130","DOI":"10.1103\/PhysRevE.87.012130","article-title":"Bivariate measure of redundant information","volume":"87","author":"Harder","year":"2013","journal-title":"Phys. Rev. E"},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"2161","DOI":"10.3390\/e16042161","article-title":"Quantifying unique information","volume":"16","author":"Bertschinger","year":"2014","journal-title":"Entropy"},{"key":"ref_15","unstructured":"Griffith, V., and Koch, C. (2014). Guided Self-Organization: Inception, Springer."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"014002","DOI":"10.1088\/1751-8121\/aaed53","article-title":"Unique information via dependency constraints","volume":"52","author":"James","year":"2018","journal-title":"J. Phys. A Math. Theor."},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Chicharro, D., and Panzeri, S. (2017). Synergy and redundancy in dual decompositions of mutual information gain and information loss. Entropy, 19.","DOI":"10.3390\/e19020071"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Bertschinger, N., Rauh, J., Olbrich, E., and Jost, J. (2012, January 2\u20137). 
Shared information\u2014New insights and problems in decomposing information in complex systems. Proceedings of the European Conference on Complex Systems 2012, Brussels, Belgium.","DOI":"10.1007\/978-3-319-00395-5_35"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Rauh, J., Banerjee, P., Olbrich, E., Jost, J., and Bertschinger, N., and Wolpert, D. (2017). Coarse-graining and the Blackwell order. Entropy, 19.","DOI":"10.3390\/e19100527"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Ince, R. (2017). Measuring multivariate redundant information with pointwise common change in surprisal. Entropy, 19.","DOI":"10.3390\/e19070318"},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Kolchinsky, A. (2022). A Novel Approach to the Partial Information Decomposition. Entropy, 24.","DOI":"10.3390\/e24030403"},{"key":"ref_22","unstructured":"Csisz\u00e1r, I., and Elias, P. (1977). Topics in Information Theory, North-Holland Pub. Co."},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Am\u00e9rico, A., Khouzani, A., and Malacaria, P. (2021). Channel-Supermodular Entropies: Order Theory and an Application to Query Anonymization. Entropy, 24.","DOI":"10.3390\/e24010039"},{"key":"ref_24","unstructured":"Cohen, J., Kempermann, J., and Zbaganu, G. (1998). Comparisons of Stochastic Matrices with Applications in Information Theory, Statistics, Economics and Population, Springer Science & Business Media."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"265","DOI":"10.1214\/aoms\/1177729032","article-title":"Equivalent comparisons of experiments","volume":"24","author":"Blackwell","year":"1953","journal-title":"Ann. Math. Stat."},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Makur, A., and Polyanskiy, Y. (2017, January 25\u201330). Less noisy domination by symmetric channels. 
Proceedings of the 2017 IEEE International Symposium on Information Theory (ISIT), Aachen, Germany.","DOI":"10.1109\/ISIT.2017.8006972"},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Csisz\u00e1r, I., and K\u00f6rner, J. (2011). Information Theory: Coding Theorems for Discrete Memoryless Systems, Cambridge University Press.","DOI":"10.1017\/CBO9780511921889"},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"1355","DOI":"10.1002\/j.1538-7305.1975.tb02040.x","article-title":"The wire-tap channel","volume":"54","author":"Wyner","year":"1975","journal-title":"Bell Syst. Tech. J."},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Bassi, G., Piantanida, P., and Shamai, S. (2019). The secret key capacity of a class of noisy channels with correlated sources. Entropy, 21.","DOI":"10.3390\/e21080732"},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"166","DOI":"10.1109\/TIT.1979.1056029","article-title":"The capacity of a class of broadcast channels","volume":"25","author":"Gamal","year":"1979","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"181","DOI":"10.1093\/logcom\/exi009","article-title":"Quantitative information flow, relations and polymorphic types","volume":"15","author":"Clark","year":"2005","journal-title":"J. Log. Comput."},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"1985","DOI":"10.3390\/e16041985","article-title":"Intersection information based on common randomness","volume":"16","author":"Griffith","year":"2014","journal-title":"Entropy"},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"052802","DOI":"10.1103\/PhysRevE.91.052802","article-title":"Exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems","volume":"91","author":"Barrett","year":"2015","journal-title":"Phys. Rev. 
E"},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"325","DOI":"10.1088\/0954-898X_10_4_303","article-title":"How to measure the information gained from one symbol","volume":"10","author":"DeWeese","year":"1999","journal-title":"Netw. Comput. Neural Syst."},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Rauh, J., Banerjee, P., Olbrich, E., Jost, J., and Bertschinger, N. (2017). On extractable shared information. Entropy, 19.","DOI":"10.3390\/e19070328"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Rauh, J., Bertschinger, N., Olbrich, E., and Jost, J. (July, January 29). Reconsidering unique information: Towards a multivariate information decomposition. Proceedings of the 2014 IEEE International Symposium on Information Theory, Honolulu, HI, USA.","DOI":"10.1109\/ISIT.2014.6875230"},{"key":"ref_37","first-page":"149","article-title":"Common information is far less than mutual information","volume":"2","year":"1973","journal-title":"Probl. Control Inf. Theory"},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"4644","DOI":"10.3390\/e17074644","article-title":"Quantifying redundant information in predicting a target random variable","volume":"17","author":"Griffith","year":"2015","journal-title":"Entropy"},{"key":"ref_39","doi-asserted-by":"crossref","unstructured":"Finn, C., and Lizier, J. (2018). Pointwise Partial Information Decomposition Using the Specificity and Ambiguity Lattices. Entropy, 20.","DOI":"10.3390\/e20040297"},{"key":"ref_40","doi-asserted-by":"crossref","first-page":"738","DOI":"10.21105\/joss.00738","article-title":"\u201cdit\u201d: A Python package for discrete information theory","volume":"3","author":"James","year":"2018","journal-title":"J. Open Source Softw."},{"key":"ref_41","unstructured":"Massey, J. (July, January 27). Guessing and entropy. 
Proceedings of the 1994 IEEE International Symposium on Information Theory, Trondheim, Norway."},{"key":"ref_42","doi-asserted-by":"crossref","first-page":"479","DOI":"10.1007\/BF01016429","article-title":"Possible generalization of Boltzmann-Gibbs statistics","volume":"52","author":"Tsallis","year":"1988","journal-title":"J. Stat. Phys."},{"key":"ref_43","doi-asserted-by":"crossref","first-page":"673","DOI":"10.1038\/nature03909","article-title":"Partial quantum information","volume":"436","author":"Horodecki","year":"2005","journal-title":"Nature"}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/25\/7\/975\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T20:00:12Z","timestamp":1760126412000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/25\/7\/975"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,6,25]]},"references-count":43,"journal-issue":{"issue":"7","published-online":{"date-parts":[[2023,7]]}},"alternative-id":["e25070975"],"URL":"https:\/\/doi.org\/10.3390\/e25070975","relation":{},"ISSN":["1099-4300"],"issn-type":[{"type":"electronic","value":"1099-4300"}],"subject":[],"published":{"date-parts":[[2023,6,25]]}}}