{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,13]],"date-time":"2026-01-13T00:02:18Z","timestamp":1768262538807,"version":"3.49.0"},"reference-count":67,"publisher":"MDPI AG","issue":"3","license":[{"start":{"date-parts":[[2018,3,5]],"date-time":"2018-03-05T00:00:00Z","timestamp":1520208000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>Understanding how different information sources together transmit information is crucial in many domains. For example, understanding the neural code requires characterizing how different neurons contribute unique, redundant, or synergistic pieces of information about sensory or behavioral variables. Williams and Beer (2010) proposed a partial information decomposition (PID) that separates the mutual information that a set of sources contains about a set of targets into nonnegative terms interpretable as these pieces. Quantifying redundancy requires assigning an identity to different information pieces, to assess when information is common across sources. Harder et al. (2013) proposed an identity axiom that imposes necessary conditions to quantify qualitatively common information. However, Bertschinger et al. (2012) showed that, in a counterexample with deterministic target-source dependencies, the identity axiom is incompatible with ensuring PID nonnegativity. Here, we study systematically the consequences of information identity criteria that assign identity based on associations between target and source variables resulting from deterministic dependencies. We show how these criteria are related to the identity axiom and to previously proposed redundancy measures, and we characterize how they lead to negative PID terms. This constitutes a further step to more explicitly address the role of information identity in the quantification of redundancy. The implications for studying neural coding are discussed.<\/jats:p>","DOI":"10.3390\/e20030169","type":"journal-article","created":{"date-parts":[[2018,3,6]],"date-time":"2018-03-06T07:37:25Z","timestamp":1520321845000},"page":"169","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":6,"title":["The Identity of Information: How Deterministic Dependencies Constrain Information Synergy and Redundancy"],"prefix":"10.3390","volume":"20","author":[{"given":"Daniel","family":"Chicharro","sequence":"first","affiliation":[{"name":"Department of Neurobiology, Harvard Medical School, Boston, MA 02115, USA"},{"name":"Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto (TN) 38068, Italy"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-4506-0600","authenticated-orcid":false,"given":"Giuseppe","family":"Pica","sequence":"additional","affiliation":[{"name":"Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto (TN) 38068, Italy"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-1700-8909","authenticated-orcid":false,"given":"Stefano","family":"Panzeri","sequence":"additional","affiliation":[{"name":"Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, Rovereto (TN) 38068, Italy"}]}],"member":"1968","published-online":{"date-parts":[[2018,3,5]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"1701","DOI":"10.1109\/18.930911","article-title":"Information geometry on hierarchy of probability distributions","volume":"47","author":"Amari","year":"2001","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"238701","DOI":"10.1103\/PhysRevLett.91.238701","article-title":"Network information and connected correlations","volume":"91","author":"Schneidman","year":"2003","journal-title":"Phys. Rev. Lett."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"713","DOI":"10.1016\/j.neunet.2010.05.008","article-title":"Information-theoretic methods for studying population codes","volume":"23","author":"Ince","year":"2010","journal-title":"Neural Netw."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"1001","DOI":"10.1098\/rspb.1999.0736","article-title":"Correlations and the encoding of information in the nervous system","volume":"266","author":"Panzeri","year":"1999","journal-title":"Proc. R. Soc. Lond. B Biol. Sci."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"999","DOI":"10.1162\/NECO_a_00588","article-title":"A Causal Perspective on the Analysis of Signal and Noise Correlations and Their Role in Population Coding","volume":"26","author":"Chicharro","year":"2014","journal-title":"Neural Comput."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"119","DOI":"10.1007\/s10827-013-0458-4","article-title":"Synergy, redundancy, and multivariate information measures: An experimentalist\u2019s perspective","volume":"36","author":"Timme","year":"2014","journal-title":"J. Comput. Neurosci."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"302","DOI":"10.1111\/j.1749-6632.2008.03757.x","article-title":"Inference of regulatory gene interactions from expression data using three-way mutual information","volume":"1158","author":"Watkinson","year":"2009","journal-title":"Ann. N. Y. Acad. Sci."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"141","DOI":"10.1038\/nrg2499","article-title":"The evolution of hierarchical gene regulatory networks","volume":"10","author":"Erwin","year":"2009","journal-title":"Nat. Rev. Genet."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"250","DOI":"10.1016\/j.gene.2016.05.029","article-title":"Construction of synergy networks from gene expression data related to disease","volume":"590","author":"Chatterjee","year":"2016","journal-title":"Gene"},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"1015","DOI":"10.1016\/j.mri.2008.02.019","article-title":"On the use of information theory for the analysis of the relationship between neural and imaging signals","volume":"26","author":"Panzeri","year":"2008","journal-title":"Magn. Reson. Imaging"},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"138101","DOI":"10.1103\/PhysRevLett.102.138101","article-title":"Prediction of Spatiotemporal Patterns of Neural Activity from Pairwise Correlations","volume":"102","author":"Marre","year":"2009","journal-title":"Phys. Rev. Lett."},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"2488","DOI":"10.1109\/TBME.2016.2569823","article-title":"An Information-Theoretic Framework to Map the Spatiotemporal Dynamics of the Scalp Electroencephalogram","volume":"63","author":"Faes","year":"2016","journal-title":"IEEE Trans. Biomed. Eng."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"18720","DOI":"10.1073\/pnas.1107583108","article-title":"Inferring the structure and dynamics of interactions in schooling fish","volume":"108","author":"Katz","year":"2011","journal-title":"Proc. Natl. Acad. Sci. USA"},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"1802","DOI":"10.1098\/rstb.2011.0214","article-title":"Multiple time-scales and the developmental dynamics of social systems","volume":"367","author":"Flack","year":"2012","journal-title":"Philos. Trans. R. Soc. B Biol. Sci."},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"125","DOI":"10.1007\/s12064-011-0140-1","article-title":"Information-driven self-organization: The dynamical system approach to autonomous robot behavior","volume":"131","author":"Ay","year":"2012","journal-title":"Theory Biosci."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"5195","DOI":"10.1523\/JNEUROSCI.5319-04.2005","article-title":"Synergy, Redundancy, and Independence in Population Codes, Revisited","volume":"25","author":"Latham","year":"2005","journal-title":"J. Neurosci."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"63","DOI":"10.1007\/s12064-013-0186-3","article-title":"Robustness, canalyzing functions and systems design","volume":"133","author":"Rauh","year":"2014","journal-title":"Theory Biosci."},{"key":"ref_18","unstructured":"Tishby, N., Pereira, F.C., and Bialek, W. (1999, January 22\u201324). The Information Bottleneck Method. Proceedings of the 37th Annual Allerton Conference on Communication, Control, and Computing, Monticello, IL, USA."},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"358","DOI":"10.1038\/nrn1888","article-title":"Neural correlations, population coding and computation","volume":"7","author":"Averbeck","year":"2006","journal-title":"Nat. Rev. Neurosci."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"491","DOI":"10.1016\/j.neuron.2016.12.036","article-title":"Cracking the neural code for sensory perception by combining statistics, intervention and behavior","volume":"93","author":"Panzeri","year":"2017","journal-title":"Neuron"},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Wibral, M., Vicente, R., and Lizier, J.T. (2014). Directed Information Measures in Neuroscience, Springer.","DOI":"10.1007\/978-3-642-54474-3"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Timme, N.M., Ito, S., Myroshnychenko, M., Nigam, S., Shimono, M., Yeh, F.C., Hottowy, P., Litke, A.M., and Beggs, J.M. (2016). High-Degree Neurons Feed Cortical Computations. PLoS Comput. Biol., 12.","DOI":"10.1371\/journal.pcbi.1004858"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"111","DOI":"10.1016\/j.tins.2009.12.001","article-title":"Sensory neural codes using multiplexed temporal scales","volume":"33","author":"Panzeri","year":"2010","journal-title":"Trends Neurosci."},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"162","DOI":"10.1016\/j.tics.2015.01.002","article-title":"Neural population coding: Combining insights from microscopic and mass signals","volume":"19","author":"Panzeri","year":"2015","journal-title":"Trends Cogn. Sci."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"339","DOI":"10.1016\/j.neuroimage.2011.03.058","article-title":"Effective connectivity: Influence, causality and biophysical modeling","volume":"58","author":"Roebroeck","year":"2011","journal-title":"Neuroimage"},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"45","DOI":"10.1007\/s10827-010-0262-3","article-title":"Transfer entropy: A model-free measure of effective connectivity for the neurosciences","volume":"30","author":"Vicente","year":"2011","journal-title":"J. Comput. Neurosci."},{"key":"ref_27","doi-asserted-by":"crossref","first-page":"17681","DOI":"10.1038\/srep17681","article-title":"Tracing the Flow of Perceptual Features in an Algorithmic Brain Network","volume":"5","author":"Ince","year":"2015","journal-title":"Sci. Rep."},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"430","DOI":"10.1038\/nrn3963","article-title":"Rethinking segregation and integration: Contributions of whole-brain modelling","volume":"16","author":"Deco","year":"2015","journal-title":"Nat. Rev. Neurosci."},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"97","DOI":"10.1007\/BF02289159","article-title":"Multivariate information transmission","volume":"19","author":"McGill","year":"1954","journal-title":"Psychometrika"},{"key":"ref_30","unstructured":"Bell, A.J. (2003, January 1\u20134). The co-information lattice. Proceedings of the 4th International Symposium Independent Component Analysis and Blind Source Separation, Nara, Japan."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"3501","DOI":"10.3390\/e17053501","article-title":"Information decomposition and synergy","volume":"17","author":"Olbrich","year":"2015","journal-title":"Entropy"},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"35","DOI":"10.3389\/frobt.2015.00035","article-title":"Hierarchical quantification of synergy in channels","volume":"2","author":"Perrone","year":"2016","journal-title":"Front. Robot. AI"},{"key":"ref_33","unstructured":"Williams, P.L., and Beer, R.D. (arXiv, 2010). Nonnegative Decomposition of Multivariate Information, arXiv."},{"key":"ref_34","unstructured":"Williams, P.L. (2011). Information Dynamics: Its Theory and Application to Embodied Cognitive Systems. [Ph.D. Thesis, Indiana University]."},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"012130","DOI":"10.1103\/PhysRevE.87.012130","article-title":"Bivariate measure of redundant information","volume":"87","author":"Harder","year":"2013","journal-title":"Phys. Rev. E"},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"2161","DOI":"10.3390\/e16042161","article-title":"Quantifying unique information","volume":"16","author":"Bertschinger","year":"2014","journal-title":"Entropy"},{"key":"ref_37","doi-asserted-by":"crossref","unstructured":"Griffith, V., and Koch, C. (arXiv, 2013). Quantifying synergistic mutual information, arXiv.","DOI":"10.1007\/978-3-642-53734-9_6"},{"key":"ref_38","doi-asserted-by":"crossref","unstructured":"Ince, R.A.A. (2017). Measuring multivariate redundant information with pointwise common change in surprisal. Entropy, 19.","DOI":"10.3390\/e19070318"},{"key":"ref_39","doi-asserted-by":"crossref","unstructured":"Rauh, J., Banerjee, P.K., Olbrich, E., Jost, J., and Bertschinger, N. (2017). On Extractable Shared Information. Entropy, 19.","DOI":"10.3390\/e19070328"},{"key":"ref_40","unstructured":"Chicharro, D. (arXiv, 2017). Quantifying multivariate redundancy with maximum entropy decompositions of mutual information, arXiv."},{"key":"ref_41","doi-asserted-by":"crossref","unstructured":"Rauh, J. (2017). Secret Sharing and shared information. Entropy, 19.","DOI":"10.3390\/e19110601"},{"key":"ref_42","unstructured":"Gilbert, T., Kirkilionis, M., and Nicolis, G. (2012). Shared Information\u2014New Insights and Problems in Decomposing Information in Complex Systems. Proceedings of the European Conference on Complex Systems 2012, Springer."},{"key":"ref_43","doi-asserted-by":"crossref","unstructured":"James, R.G., Emenheiser, J., and Crutchfield, J.P. (arXiv, 2017). Unique Information via Dependency Constraints, arXiv.","DOI":"10.1088\/1751-8121\/aaed53"},{"key":"ref_44","doi-asserted-by":"crossref","unstructured":"Rauh, J., Bertschinger, N., Olbrich, E., and Jost, J. (July, January 29). Reconsidering unique information: Towards a multivariate information decomposition. Proceedings of the 2014 IEEE International Symposium on Information Theory (ISIT), Honolulu, HI, USA.","DOI":"10.1109\/ISIT.2014.6875230"},{"key":"ref_45","unstructured":"Cover, T.M., and Thomas, J.A. (2006). Elements of Information Theory, John Wiley and Sons. [2nd ed.]."},{"key":"ref_46","doi-asserted-by":"crossref","unstructured":"Chicharro, D., and Panzeri, S. (2017). Synergy and Redundancy in Dual Decompositions of Mutual Information Gain and Information Loss. Entropy, 19.","DOI":"10.3390\/e19020071"},{"key":"ref_47","doi-asserted-by":"crossref","unstructured":"Pica, G., Piasini, E., Chicharro, D., and Panzeri, S. (2017). Invariant components of synergy, redundancy, and unique information among three variables. Entropy, 19.","DOI":"10.3390\/e19090451"},{"key":"ref_48","doi-asserted-by":"crossref","first-page":"1985","DOI":"10.3390\/e16041985","article-title":"Intersection Information based on Common Randomness","volume":"16","author":"Griffith","year":"2014","journal-title":"Entropy"},{"key":"ref_49","unstructured":"Banerjee, P.K., and Griffith, V. (arXiv, 2015). Synergy, redundancy, and common information, arXiv."},{"key":"ref_50","doi-asserted-by":"crossref","first-page":"052802","DOI":"10.1103\/PhysRevE.91.052802","article-title":"Exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems","volume":"91","author":"Barrett","year":"2015","journal-title":"Phys. Rev. E"},{"key":"ref_51","doi-asserted-by":"crossref","unstructured":"Faes, L., Marinazzo, D., and Stramaglia, S. (2017). Multiscale Information Decomposition: Exact Computation for Multivariate Gaussian Processes. Entropy, 19.","DOI":"10.3390\/e19080408"},{"key":"ref_52","doi-asserted-by":"crossref","unstructured":"James, R.G., and Crutchfield, J.P. (2017). Multivariate Dependence Beyond Shannon Information. Entropy, 19.","DOI":"10.3390\/e19100531"},{"key":"ref_53","doi-asserted-by":"crossref","first-page":"379","DOI":"10.1002\/j.1538-7305.1948.tb01338.x","article-title":"A mathematical theory of communication","volume":"27","author":"Shannon","year":"1948","journal-title":"Bell. Syst. Tech. J."},{"key":"ref_54","unstructured":"Kullback, S. (1959). Information Theory and Statistics, Dover."},{"key":"ref_55","doi-asserted-by":"crossref","first-page":"5","DOI":"10.3389\/frobt.2015.00005","article-title":"Bits from brains for biologically inspired computing","volume":"2","author":"Wibral","year":"2015","journal-title":"Front. Robot. AI"},{"key":"ref_56","doi-asserted-by":"crossref","first-page":"741","DOI":"10.1162\/0899766053429435","article-title":"Quantifying Stimulus Discriminability: A Comparison of Information Theory and Ideal Observer Analysis","volume":"17","author":"Thomson","year":"2005","journal-title":"Neural Comput."},{"key":"ref_57","unstructured":"Pica, G., Piasini, E., Safaai, H., Runyan, C.A., Diamond, M.E., Fellin, T., Kayser, C., Harvey, C.D., and Panzeri, S. (2017, January 4). Quantifying how much sensory information in a neural code is relevant for behavior. Proceedings of the 31st Conference on Neural Information Processing Systems, Long Beach, CA, USA."},{"key":"ref_58","doi-asserted-by":"crossref","first-page":"424","DOI":"10.2307\/1912791","article-title":"Investigating Causal Relations by Econometric Models and Cross-Spectral Methods","volume":"37","author":"Granger","year":"1969","journal-title":"Econometrica"},{"key":"ref_59","doi-asserted-by":"crossref","first-page":"105003","DOI":"10.1088\/1367-2630\/16\/10\/105003","article-title":"Synergy and redundancy in the Granger causal analysis of dynamical networks","volume":"16","author":"Stramaglia","year":"2014","journal-title":"New J. Phys."},{"key":"ref_60","doi-asserted-by":"crossref","first-page":"2518","DOI":"10.1109\/TBME.2016.2559578","article-title":"Synergetic and redundant information flow detected by unnormalized Granger causality: Application to resting state fMRI","volume":"63","author":"Stramaglia","year":"2016","journal-title":"IEEE Trans. Biomed. Eng."},{"key":"ref_61","unstructured":"Williams, P.L., and Beer, R.D. (arXiv, 2011). Generalized Measures of Information Transfer, arXiv."},{"key":"ref_62","doi-asserted-by":"crossref","first-page":"1345","DOI":"10.1109\/TCOM.1973.1091610","article-title":"Bidirectional communication theory\u2014Generalization of information-theory","volume":"12","author":"Marko","year":"1973","journal-title":"IEEE Trans. Commun."},{"key":"ref_63","doi-asserted-by":"crossref","first-page":"461","DOI":"10.1103\/PhysRevLett.85.461","article-title":"Measuring information transfer","volume":"85","author":"Schreiber","year":"2000","journal-title":"Phys. Rev. Lett."},{"key":"ref_64","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1111\/cogs.12142","article-title":"Information Processing and Dynamics in Minimally Cognitive Agents","volume":"39","author":"Beer","year":"2015","journal-title":"Cogn. Sci."},{"key":"ref_65","doi-asserted-by":"crossref","first-page":"64","DOI":"10.3389\/fninf.2014.00064","article-title":"Algorithms of causal inference for the analysis of effective connectivity among brain regions","volume":"8","author":"Chicharro","year":"2014","journal-title":"Front. Neuroinform."},{"key":"ref_66","doi-asserted-by":"crossref","first-page":"958","DOI":"10.1038\/nn.3419","article-title":"Neural coding during active somatosensation revealed using illusory touch","volume":"16","author":"Hires","year":"2013","journal-title":"Nat. Neurosci."},{"key":"ref_67","doi-asserted-by":"crossref","first-page":"358","DOI":"10.1038\/nature16442","article-title":"Acute off-target effects of neural circuit manipulations","volume":"528","author":"Otchy","year":"2015","journal-title":"Nature"}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/20\/3\/169\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T14:57:36Z","timestamp":1760194656000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/20\/3\/169"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2018,3,5]]},"references-count":67,"journal-issue":{"issue":"3","published-online":{"date-parts":[[2018,3]]}},"alternative-id":["e20030169"],"URL":"https:\/\/doi.org\/10.3390\/e20030169","relation":{},"ISSN":["1099-4300"],"issn-type":[{"value":"1099-4300","type":"electronic"}],"subject":[],"published":{"date-parts":[[2018,3,5]]}}}