{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,12]],"date-time":"2025-10-12T03:46:24Z","timestamp":1760240784239,"version":"build-2065373602"},"reference-count":81,"publisher":"MDPI AG","issue":"9","license":[{"start":{"date-parts":[[2019,9,3]],"date-time":"2019-09-03T00:00:00Z","timestamp":1567468800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>Partial information decomposition (PID) separates the contributions of sources about a target into unique, redundant, and synergistic components of information. In essence, PID answers the question of \u201cwho knows what\u201d of a system of random variables and hence has applications to a wide spectrum of fields ranging from social to biological sciences. The paper presents MaxEnt3D_Pid, an algorithm that computes the PID of three sources, based on a recently-proposed maximum entropy measure, using convex optimization (cone programming). We describe the algorithm and its associated software utilization and report the results of various experiments assessing its accuracy. Moreover, the paper shows that a hierarchy of bivariate and trivariate PID allows obtaining the finer quantities of the trivariate partial information measure.<\/jats:p>","DOI":"10.3390\/e21090862","type":"journal-article","created":{"date-parts":[[2019,9,4]],"date-time":"2019-09-04T08:28:13Z","timestamp":1567585693000},"page":"862","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":3,"title":["MAXENT3D_PID: An Estimator for the Maximum-Entropy Trivariate Partial Information Decomposition"],"prefix":"10.3390","volume":"21","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-3581-8262","authenticated-orcid":false,"given":"Abdullah","family":"Makkeh","sequence":"first","affiliation":[{"name":"Institute of Computer Science, University of Tartu, 51014 Tartu, Estonia"}]},{"given":"Daniel","family":"Chicharro","sequence":"additional","affiliation":[{"name":"Neural Computation Laboratory, Center for Neuroscience and Cognitive Systems@UniTn, Istituto Italiano di Tecnologia, 38068 Rovereto (TN), Italy"}]},{"given":"Dirk Oliver","family":"Theis","sequence":"additional","affiliation":[{"name":"Institute of Computer Science, University of Tartu, 51014 Tartu, Estonia"}]},{"given":"Raul","family":"Vicente","sequence":"additional","affiliation":[{"name":"Institute of Computer Science, University of Tartu, 51014 Tartu, Estonia"}]}],"member":"1968","published-online":{"date-parts":[[2019,9,3]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"1701","DOI":"10.1109\/18.930911","article-title":"Information Geometry on Hierarchy of Probability Distributions","volume":"47","author":"Amari","year":"2001","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Schneidman, E., Still, S., Berry, M.J., and Bialek, W. (2003). Network Information and Connected Correlations. Phys. Rev. Lett., 91.","DOI":"10.1103\/PhysRevLett.91.238701"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"119","DOI":"10.1007\/s10827-013-0458-4","article-title":"Synergy, Redundancy, and Multivariate Information Measures: An Experimentalist\u2019s Perspective","volume":"36","author":"Timme","year":"2014","journal-title":"J. Comput. Neurosci."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"3501","DOI":"10.3390\/e17053501","article-title":"Information Decomposition and Synergy","volume":"17","author":"Olbrich","year":"2015","journal-title":"Entropy"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"35","DOI":"10.3389\/frobt.2015.00035","article-title":"Hierarchical Quantification of Synergy in Channels","volume":"2","author":"Perrone","year":"2016","journal-title":"Front. Robot. AI"},{"key":"ref_6","unstructured":"Williams, P.L., and Beer, R.D. (2010). Nonnegative Decomposition of Multivariate Information. arXiv."},{"key":"ref_7","unstructured":"Williams, P.L. (2011). Information Dynamics: Its Theory and Application to Embodied Cognitive Systems. [Ph.D. Thesis, Indiana University]."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"83","DOI":"10.1038\/msb4100124","article-title":"Computational Analysis of the Synergy among Multiple Interacting Genes","volume":"3","author":"Anastassiou","year":"2007","journal-title":"Mol. Syst. Biol."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"302","DOI":"10.1111\/j.1749-6632.2008.03757.x","article-title":"Inference of Regulatory Gene Interactions from Expression Data Using Three-way Mutual Information","volume":"1158","author":"Watkinson","year":"2009","journal-title":"Ann. N. Y. Acad. Sci."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"250","DOI":"10.1016\/j.gene.2016.05.029","article-title":"Construction of Synergy Networks from Gene Expression Data Related to Disease","volume":"590","author":"Chatterjee","year":"2016","journal-title":"Gene"},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"18720","DOI":"10.1073\/pnas.1107583108","article-title":"Inferring the Structure and Dynamics of Interactions in Schooling Fish","volume":"108","author":"Katz","year":"2011","journal-title":"Proc. Natl. Acad. Sci. USA"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"1802","DOI":"10.1098\/rstb.2011.0214","article-title":"Multiple Time-scales and the Developmental Dynamics of Social Systems","volume":"367","author":"Flack","year":"2012","journal-title":"Philos. Trans. R. Soc. B Biol. Sci."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"125","DOI":"10.1007\/s12064-011-0140-1","article-title":"Information-Driven Self-Organization: The Dynamical System Approach to Autonomous Robot Behavior","volume":"131","author":"Ay","year":"2012","journal-title":"Theory Biosci."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"1457","DOI":"10.1111\/cogs.12632","article-title":"Synergistic Information Processing Encrypts Strategic Reasoning in Poker","volume":"42","author":"Frey","year":"2018","journal-title":"Cogn. Sci."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Marre, O., El Boustani, S., Fr\u00e9gnac, Y., and Destexhe, A. (2009). Prediction of Spatiotemporal Patterns of Neural Activity from Pairwise Correlations. Phys. Rev. Lett., 102.","DOI":"10.1103\/PhysRevLett.102.138101"},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"2488","DOI":"10.1109\/TBME.2016.2569823","article-title":"An Information-Theoretic Framework to Map the Spatiotemporal Dynamics of the Scalp Electroencephalogram","volume":"63","author":"Faes","year":"2016","journal-title":"IEEE. Trans. Biomed. Eng."},{"key":"ref_17","unstructured":"Pica, G., Piasini, E., Safaai, H., Runyan, C.A., Diamond, M.E., Fellin, T., Kayser, C., Harvey, C.D., and Panzeri, S. (2017, January 4\u20139). Quantifying How Much Sensory Information in a Neural Code is Relevant for Behavior. Proceedings of the 31st Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA."},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"5195","DOI":"10.1523\/JNEUROSCI.5319-04.2005","article-title":"Synergy, Redundancy, and Independence in Population Codes, Revisited","volume":"25","author":"Latham","year":"2005","journal-title":"J. Neurosci."},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Ver Steeg, G., Brekelmans, R., Harutyunyan, H., and Galstyan, A. (2017). Disentangled Representations via Synergy Minimization. arXiv.","DOI":"10.1109\/ALLERTON.2017.8262735"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"63","DOI":"10.1007\/s12064-013-0186-3","article-title":"Robustness, Canalyzing Functions and Systems Design","volume":"133","author":"Rauh","year":"2014","journal-title":"Theory Biosci."},{"key":"ref_21","unstructured":"Tishby, N., Pereira, F.C., and Bialek, W. (2000). The Information Bottleneck Method. arXiv."},{"key":"ref_22","unstructured":"Banerjee, P.K.R., and Mont\u00fafar, G. (2018). The Variational Deficiency Bottleneck. arXiv."},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Harder, M., Salge, C., and Polani, D. (2013). Bivariate Measure of Redundant Information. Phys. Rev. E, 87.","DOI":"10.1103\/PhysRevE.87.012130"},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"2161","DOI":"10.3390\/e16042161","article-title":"Quantifying Unique Information","volume":"16","author":"Bertschinger","year":"2014","journal-title":"Entropy"},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Griffith, V., and Koch, C. (2014). Quantifying Synergistic Mutual Information. Guided Self-Organization: Inception, Springer.","DOI":"10.1007\/978-3-642-53734-9_6"},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Ince, R.A.A. (2017). Measuring Multivariate Redundant Information with Pointwise Common Change in Surprisal. Entropy, 19.","DOI":"10.3390\/e19070318"},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"James, R.G., Emenheiser, J., and Crutchfield, J.P. (2017). Unique Information via Dependency Constraints. arXiv.","DOI":"10.1088\/1751-8121\/aaed53"},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Chicharro, D., and Panzeri, S. (2017). Synergy and Redundancy in Dual Decompositions of Mutual Information Gain and Information Loss. Entropy, 19.","DOI":"10.3390\/e19020071"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Finn, C., and Lizier, J.T. (2018). Pointwise Information Decomposition Using the Specificity and Ambiguity Lattices. arXiv.","DOI":"10.3390\/e20040297"},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Rauh, J. (2017). Secret Sharing and Shared Information. Entropy, 19.","DOI":"10.3390\/e19110601"},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"James, R.G., Emenheiser, J., and Crutchfield, J.P. (2017). A Perspective on Unique Information: Directionality, Intuitions, and Secret Key Agreement. arXiv.","DOI":"10.3390\/e21010012"},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Chicharro, D., Pica, G., and Panzeri, S. (2018). The Identity of Information: How Deterministic Dependencies Constrain Information Synergy and Redundancy. Entropy, 20.","DOI":"10.3390\/e20030169"},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Rauh, J., Banerjee, P.K., Olbrich, E., Jost, J., and Bertschinger, N. (2017). On Extractable Shared Information. Entropy, 19.","DOI":"10.3390\/e19070328"},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Cover, T.M., and Thomas, J.A. (2006). Elements of Information Theory, Wiley. [2nd ed.].","DOI":"10.1002\/047174882X"},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"5","DOI":"10.3389\/frobt.2015.00005","article-title":"Bits from Brains for Biologically Inspired Computing","volume":"2","author":"Wibral","year":"2015","journal-title":"Front. Robot. AI"},{"key":"ref_36","doi-asserted-by":"crossref","first-page":"379","DOI":"10.1002\/j.1538-7305.1948.tb01338.x","article-title":"A mathematical theory of communication","volume":"27","author":"Shannon","year":"1948","journal-title":"Bell Syst. Tech. J."},{"key":"ref_37","doi-asserted-by":"crossref","unstructured":"Pica, G., Piasini, E., Chicharro, D., and Panzeri, S. (2017). Invariant Components of Synergy, Redundancy, and Unique Information among Three Variables. Entropy, 19.","DOI":"10.3390\/e19090451"},{"key":"ref_38","unstructured":"Chicharro, D. (2017). Quantifying Multivariate Redundancy with Maximum Entropy Decompositions of Mutual Information. arXiv."},{"key":"ref_39","doi-asserted-by":"crossref","unstructured":"Makkeh, A., Theis, D.O., and Vicente, R. (2017). Bivariate Partial Information Decomposition: The Optimization Perspective. Entropy, 19.","DOI":"10.3390\/e19100530"},{"key":"ref_40","doi-asserted-by":"crossref","unstructured":"Banerjee, P.K., Rauh, J., and Mont\u00fafar, G. (2018). Computing the Unique Information. arXiv.","DOI":"10.1109\/ISIT.2018.8437757"},{"key":"ref_41","doi-asserted-by":"crossref","unstructured":"Makkeh, A., Theis, D.O., and Vicente, R. (2018). Broja_2Pid: A Robust Estimator for Bivariate Partial Information Decomposition. Entropy, 20.","DOI":"10.3390\/e20040271"},{"key":"ref_42","doi-asserted-by":"crossref","first-page":"738","DOI":"10.21105\/joss.00738","article-title":"dit: A Python Package for Discrete Information Theory","volume":"3","author":"James","year":"2018","journal-title":"J. Open Source Softw."},{"key":"ref_43","doi-asserted-by":"crossref","unstructured":"Bertschinger, N., Rauh, J., Olbrich, E., and Jost, J. (2012). Shared Information\u2014New Insights and Problems in Decomposing Information in Complex Systems. Proceedings of the European Conference on Complex Systems 2012, Springer.","DOI":"10.1007\/978-3-319-00395-5_35"},{"key":"ref_44","unstructured":"Burnham, K.P., and Anderson, D.R. (2002). Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach, Springer. [2nd ed.]."},{"key":"ref_45","doi-asserted-by":"crossref","first-page":"2518","DOI":"10.1109\/TBME.2016.2559578","article-title":"Synergetic and Redundant Information Flow Detected by Unnormalized Granger Causality: Application to Resting State fMRI","volume":"63","author":"Stramaglia","year":"2016","journal-title":"IEEE. Trans. Biomed. Eng."},{"key":"ref_46","doi-asserted-by":"crossref","unstructured":"Wibral, M., Finn, C., Wollstadt, P., Lizier, J.T., and Priesemann, V. (2017). Quantifying Information Modification in Developing Neural Networks via Partial Information Decomposition. Entropy, 19.","DOI":"10.3390\/e19090494"},{"key":"ref_47","doi-asserted-by":"crossref","unstructured":"Ghazi-Zahedi, K., Langer, C., and Ay, N. (2017). Morphological Computation: Synergy of Body and Brain. Entropy, 19.","DOI":"10.3390\/e19090456"},{"key":"ref_48","doi-asserted-by":"crossref","unstructured":"Faes, L., Marinazzo, D., and Stramaglia, S. (2017). Multiscale Information Decomposition: Exact Computation for Multivariate Gaussian Processes. Entropy, 19.","DOI":"10.3390\/e19080408"},{"key":"ref_49","doi-asserted-by":"crossref","unstructured":"Tax, T.M.S., Mediano, P.A.M., and Shanahan, M. (2017). The Partial Information Decomposition of Generative Neural Network Models. Entropy, 19.","DOI":"10.3390\/e19090474"},{"key":"ref_50","unstructured":"Schwartz-Ziv, R., and Tishby, N. (2017). Opening the Black Box of Deep Neural Networks via Information. arXiv."},{"key":"ref_51","unstructured":"Makkeh, A. (2018). Applications of Optimization in Some Complex Systems. [Ph.D. Thesis, University of Tartu]."},{"key":"ref_52","doi-asserted-by":"crossref","unstructured":"Domahidi, A., Chu, E., and Boyd, S. (2013, January 17\u201319). ECOS: An SOCP Solver for Embedded Systems. Proceedings of the European Control Conference, Zurich, Switzerland.","DOI":"10.23919\/ECC.2013.6669541"},{"key":"ref_53","doi-asserted-by":"crossref","unstructured":"Makkeh, A., Theis, D.O., Vicente, R., and Chicharro, D. (2018, June 21). A Trivariate PID Estimator. Available online: https:\/\/github.com\/Abzinger\/MAXENT3D_PID.","DOI":"10.3390\/e21090862"},{"key":"ref_54","doi-asserted-by":"crossref","first-page":"575","DOI":"10.1137\/0802028","article-title":"On the implementation of a primal-dual interior point method","volume":"2","author":"Mehrotra","year":"1992","journal-title":"SIAM J. Optim."},{"key":"ref_55","doi-asserted-by":"crossref","first-page":"281","DOI":"10.1016\/S0377-0427(00)00433-7","article-title":"Interior-point methods","volume":"124","author":"Potra","year":"2000","journal-title":"J. Comput. Appl. Math."},{"key":"ref_56","doi-asserted-by":"crossref","unstructured":"James, R.G., and Crutchfield, J.P. (2017). Multivariate Dependence Beyond Shannon Information. Entropy, 19.","DOI":"10.3390\/e19100531"},{"key":"ref_57","doi-asserted-by":"crossref","unstructured":"Lizier, J.T., Flecker, B., and Williams, P.L. (2013, January 16\u201319). Towards a Synergy-based Approach to Measuring Information Modification. Proceedings of the 2013 IEEE Symposium on Artificial Life (ALife), Singapore.","DOI":"10.1109\/ALIFE.2013.6602430"},{"key":"ref_58","doi-asserted-by":"crossref","first-page":"25","DOI":"10.1016\/j.bandc.2015.09.004","article-title":"Partial Information Decomposition as a Unified Approach to the Specification of Neural Goal Functions","volume":"112","author":"Wibral","year":"2015","journal-title":"Brain Cogn."},{"key":"ref_59","unstructured":"Banerjee, P.K., and Griffith, V. (2015). Synergy, Redundancy, and Common Information. arXiv."},{"key":"ref_60","doi-asserted-by":"crossref","unstructured":"Kay, J.W., and Ince, R.A.A. (2018). Exact Partial Information Decompositions for Gaussian Systems Based on Dependency Constraints. arXiv.","DOI":"10.3390\/e20040240"},{"key":"ref_61","doi-asserted-by":"crossref","first-page":"283","DOI":"10.1007\/s11721-018-0157-x","article-title":"Informative and Misinformative Interactions in a School of Fish","volume":"12","author":"Crosato","year":"2018","journal-title":"Swarm Intell."},{"key":"ref_62","doi-asserted-by":"crossref","unstructured":"Sootla, S., Theis, D.O., and Vicente, R. (2017). Analyzing Information Distribution in Complex Systems. Entropy, 19.","DOI":"10.3390\/e19120636"},{"key":"ref_63","doi-asserted-by":"crossref","first-page":"141","DOI":"10.1038\/nrg2499","article-title":"The Evolution of Hierarchical Gene Regulatory Networks","volume":"10","author":"Erwin","year":"2009","journal-title":"Nat. Rev. Genet."},{"key":"ref_64","doi-asserted-by":"crossref","first-page":"3311","DOI":"10.1016\/S0042-6989(97)00169-7","article-title":"Sparse Coding with an Overcomplete Basis Set: A Strategy Employed by V1?","volume":"37","author":"Olshausen","year":"1997","journal-title":"Vis. Res."},{"key":"ref_65","doi-asserted-by":"crossref","first-page":"6908","DOI":"10.1073\/pnas.1506855112","article-title":"Predictive Information in a Sensory Population","volume":"112","author":"Palmer","year":"2015","journal-title":"Proc. Natl. Acad. Sci. USA"},{"key":"ref_66","doi-asserted-by":"crossref","first-page":"032904","DOI":"10.1103\/PhysRevE.91.032904","article-title":"Estimating the Decomposition of Predictive Information in Multivariate Systems","volume":"91","author":"Faes","year":"2015","journal-title":"Phys. Rev. E"},{"key":"ref_67","doi-asserted-by":"crossref","unstructured":"Chicharro, D., and Ledberg, A. (2012). Framework to Study Dynamic Dependencies in Networks of Interacting Processes. Phys. Rev. E, 86.","DOI":"10.1103\/PhysRevE.86.041901"},{"key":"ref_68","doi-asserted-by":"crossref","first-page":"461","DOI":"10.1103\/PhysRevLett.85.461","article-title":"Measuring Information Transfer","volume":"85","author":"Schreiber","year":"2000","journal-title":"Phys. Rev. Lett."},{"key":"ref_69","doi-asserted-by":"crossref","first-page":"3416","DOI":"10.3390\/e16063416","article-title":"Identifying the coupling structure in complex systems through the optimal causation entropy principle","volume":"16","author":"Sun","year":"2014","journal-title":"Entropy"},{"key":"ref_70","doi-asserted-by":"crossref","first-page":"45","DOI":"10.1007\/s10827-010-0262-3","article-title":"Transfer Entropy: A Model-free Measure of Effective Connectivity for the Neurosciences","volume":"30","author":"Vicente","year":"2011","journal-title":"J. Comput. Neurosci."},{"key":"ref_71","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1016\/j.physrep.2006.12.004","article-title":"Causality detection based on information-theoretic approaches in time series analysis","volume":"441","author":"Vejmelka","year":"2007","journal-title":"Phys. Rep."},{"key":"ref_72","doi-asserted-by":"crossref","unstructured":"Vicente, R., and Wibral, M. (2014). Efficient estimation of information transfer. Directed Information Measures in Neuroscience, Springer.","DOI":"10.1007\/978-3-642-54474-3_2"},{"key":"ref_73","doi-asserted-by":"crossref","first-page":"339","DOI":"10.1016\/j.neuroimage.2011.03.058","article-title":"Effective Connectivity: Influence, Causality and Biophysical Modeling","volume":"58","author":"Roebroeck","year":"2011","journal-title":"NeuroImage"},{"key":"ref_74","doi-asserted-by":"crossref","unstructured":"Wibral, M., Vicente, R., and Lizier, J.T. (2014). Directed Information Measures in Neuroscience, Springer.","DOI":"10.1007\/978-3-642-54474-3"},{"key":"ref_75","doi-asserted-by":"crossref","unstructured":"Wibral, M., Vicente, R., and Lindner, M. (2014). Transfer entropy in neuroscience. Directed Information Measures in Neuroscience, Springer.","DOI":"10.1007\/978-3-642-54474-3"},{"key":"ref_76","doi-asserted-by":"crossref","first-page":"430","DOI":"10.1038\/nrn3963","article-title":"Rethinking Segregation and Integration: Contributions of Whole-brain Modelling","volume":"16","author":"Deco","year":"2015","journal-title":"Nat. Rev. Neurosci."},{"key":"ref_77","doi-asserted-by":"crossref","first-page":"106","DOI":"10.1016\/j.conb.2016.01.012","article-title":"Quantifying Collectivity","volume":"37","author":"Daniels","year":"2016","journal-title":"Curr. Opin. Neurobiol."},{"key":"ref_78","doi-asserted-by":"crossref","first-page":"691","DOI":"10.1162\/neco.1992.4.5.691","article-title":"Local Synaptic Learning Rules Suffice to Maximize Mutual Information in a Linear Network","volume":"4","author":"Linsker","year":"1992","journal-title":"Neural Comput."},{"key":"ref_79","doi-asserted-by":"crossref","first-page":"1129","DOI":"10.1162\/neco.1995.7.6.1129","article-title":"An Information Maximisation Approach to Blind Separation and Blind Deconvolution","volume":"7","author":"Bell","year":"1995","journal-title":"Neural Comput."},{"key":"ref_80","doi-asserted-by":"crossref","first-page":"052802","DOI":"10.1103\/PhysRevE.91.052802","article-title":"Exploration of Synergistic and Redundant Information Sharing in Static and Dynamical Gaussian Systems","volume":"91","author":"Barrett","year":"2015","journal-title":"Phys. Rev. E"},{"key":"ref_81","doi-asserted-by":"crossref","first-page":"11","DOI":"10.3389\/frobt.2014.00011","article-title":"JIDT: An information-theoretic toolkit for studying the dynamics of complex systems","volume":"1","author":"Lizier","year":"2014","journal-title":"Front. Robot. AI"}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/21\/9\/862\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T13:16:32Z","timestamp":1760188592000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/21\/9\/862"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2019,9,3]]},"references-count":81,"journal-issue":{"issue":"9","published-online":{"date-parts":[[2019,9]]}},"alternative-id":["e21090862"],"URL":"https:\/\/doi.org\/10.3390\/e21090862","relation":{},"ISSN":["1099-4300"],"issn-type":[{"type":"electronic","value":"1099-4300"}],"subject":[],"published":{"date-parts":[[2019,9,3]]}}}