{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,3]],"date-time":"2026-02-03T15:59:46Z","timestamp":1770134386583,"version":"3.49.0"},"reference-count":36,"publisher":"MDPI AG","issue":"4","license":[{"start":{"date-parts":[[2018,3,30]],"date-time":"2018-03-30T00:00:00Z","timestamp":1522368000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>The Partial Information Decomposition, introduced by Williams P. L. et al. (2010), provides a theoretical framework to characterize and quantify the structure of multivariate information sharing. A new method (I_dep) has recently been proposed by James R. G. et al. (2017) for computing a two-predictor partial information decomposition over discrete spaces. A lattice of maximum entropy probability models is constructed based on marginal dependency constraints, and the unique information that a particular predictor has about the target is defined as the minimum increase in joint predictor-target mutual information when that particular predictor-target marginal dependency is constrained. Here, we apply the I_dep approach to Gaussian systems, for which the marginally constrained maximum entropy models are Gaussian graphical models. Closed form solutions for the I_dep PID are derived for both univariate and multivariate Gaussian systems. Numerical and graphical illustrations are provided, together with practical and theoretical comparisons of the I_dep PID with the minimum mutual information partial information decomposition (I_mmi), which was discussed by Barrett A. B. (2015).
The results obtained using I_dep appear to be more intuitive than those given with other methods, such as I_mmi, in which the redundant and unique information components are constrained to depend only on the predictor-target marginal distributions. In particular, it is proved that the I_mmi method generally produces larger estimates of redundancy and synergy than does the I_dep method. In discussion of the practical examples, the PIDs are complemented by the use of tests of deviance for the comparison of Gaussian graphical models.<\/jats:p>","DOI":"10.3390\/e20040240","type":"journal-article","created":{"date-parts":[[2018,3,30]],"date-time":"2018-03-30T12:43:48Z","timestamp":1522413828000},"page":"240","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":14,"title":["Exact Partial Information Decompositions for Gaussian Systems Based on Dependency Constraints"],"prefix":"10.3390","volume":"20","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-2189-6999","authenticated-orcid":false,"given":"Jim","family":"Kay","sequence":"first","affiliation":[{"name":"Department of Statistics, University of Glasgow, Glasgow G12 8QQ, UK"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-8427-0507","authenticated-orcid":false,"given":"Robin","family":"Ince","sequence":"additional","affiliation":[{"name":"Institute of Neuroscience and Psychology, University of Glasgow, Glasgow G12 8QQ, UK"}]}],"member":"1968","published-online":{"date-parts":[[2018,3,30]]},"reference":[{"key":"ref_1","unstructured":"Williams, P.L., and Beer, R.D. (arXiv, 2010). Nonnegative Decomposition of Multivariate Information, arXiv."},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"James, R.G., Emenheiser, J., and Crutchfield, J.P. (arXiv, 2017).
Unique Information via Dependency Constraints, arXiv.","DOI":"10.1088\/1751-8121\/aaed53"},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Harder, M., Salge, C., and Polani, D. (2013). Bivariate measure of redundant information. Phys. Rev. E, 87.","DOI":"10.1103\/PhysRevE.87.012130"},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"159","DOI":"10.1007\/978-3-642-53734-9_6","article-title":"Quantifying synergistic mutual information","volume":"Volume 9","author":"Griffith","year":"2014","journal-title":"Guided Self-Organization: Inception. Emergence, Complexity and Computation"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"2161","DOI":"10.3390\/e16042161","article-title":"Quantifying Unique Information","volume":"16","author":"Bertschinger","year":"2014","journal-title":"Entropy"},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Ince, R.A.A. (2017). Measuring multivariate redundant information with pointwise common change in surprisal. Entropy, 19.","DOI":"10.3390\/e19070318"},{"key":"ref_7","unstructured":"Chicharro, D. (arXiv, 2017). Quantifying multivariate redundancy with maximum entropy decompositions of mutual information, arXiv."},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Finn, C., and Lizier, J.T. (arXiv, 2018). Pointwise Information Decomposition using the Specificity and Ambiguity Lattices, arXiv.","DOI":"10.3390\/e20040297"},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Rauh, J., Bertschinger, N., Olbrich, E., and Jost, J. (July, January 29). Reconsidering unique information: Towards a multivariate information decomposition. Proceedings of the 2014 IEEE International Symposium on Information Theory (ISIT), Honolulu, HI, USA.","DOI":"10.1109\/ISIT.2014.6875230"},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Rauh, J. (arXiv, 2017). 
Secret sharing and shared information, arXiv.","DOI":"10.3390\/e19110601"},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Rauh, J., Banerjee, P.K., Olbrich, E., Jost, J., and Bertschinger, N. (arXiv, 2017). On extractable shared information, arXiv.","DOI":"10.3390\/e19070328"},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Barrett, A.B. (2015). An exploration of synergistic and redundant information sharing in static and dynamical Gaussian systems. Phys. Rev. E, 91.","DOI":"10.1103\/PhysRevE.91.052802"},{"key":"ref_13","unstructured":"Whittaker, J. (2008). Graphical Models in Applied Multivariate Statistics, Wiley."},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Edwards, D. (2000). Introduction to Graphical Modelling, Springer.","DOI":"10.1007\/978-1-4612-0493-0"},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Lauritzen, S.L. (1996). Graphical Models, Oxford University Press.","DOI":"10.1093\/oso\/9780198522195.001.0001"},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"157","DOI":"10.2307\/2528966","article-title":"Covariance Selection","volume":"28","author":"Dempster","year":"1972","journal-title":"Biometrics"},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"1477","DOI":"10.1016\/S1053-8119(03)00160-5","article-title":"Multivariate autoregressive modeling of fMRI time series","volume":"19","author":"Harrison","year":"2003","journal-title":"NeuroImage"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"135","DOI":"10.1016\/S0079-6123(06)59009-0","article-title":"Analyzing event-related EEG data with multivariate autoregressive parameters","volume":"159","author":"Supp","year":"2006","journal-title":"Prog. Brain Res."},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Magri, C., Whittingstall, K., Singh, V., Logothetis, N.K., and Panzeri, S. (2009). A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings. 
BMC Neurosci., 10.","DOI":"10.1186\/1471-2202-10-81"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"1541","DOI":"10.1002\/hbm.23471","article-title":"A Statistical Framework for Neuroimaging Data Analysis Based on Mutual Information Estimated via a Gaussian Copula","volume":"38","author":"Ince","year":"2017","journal-title":"Hum. Brain Mapp."},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"3501","DOI":"10.3390\/e17053501","article-title":"Information decomposition and synergy","volume":"17","author":"Olbrich","year":"2015","journal-title":"Entropy"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Faes, F., Marinazzo, D., and Stramaglia, S. (2017). Multiscale Information Decomposition: Exact Computation for Multivariate Gaussian Processes. Entropy, 19.","DOI":"10.3390\/e19080408"},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Stramaglia, S., Wu, G.-R., Pellicoro, M., and Marinazzo, D. (2012). Expanding the transfer entropy to identify information circuits in complex systems. Phys. Rev. E, 86.","DOI":"10.1103\/PhysRevE.86.066211"},{"key":"ref_24","unstructured":"Mardia, K.V., Kent, J.T., and Bibby, J.M. (1979). Multivariate Analysis, Academic Press."},{"key":"ref_25","unstructured":"Cover, T.M., and Thomas, J.A. (1991). Elements of Information Theory, Wiley-Interscience."},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Horn, R.A., and Johnson, C.R. (1985). Matrix Analysis, Cambridge University Press.","DOI":"10.1017\/CBO9780511810817"},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Aitchison, J., Kay, J.W., and Lauder, I.J. (2005). 
Statistical Concepts and Applications in Clinical Medicine, Chapman & Hall.","DOI":"10.4324\/9780203497418"},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"355","DOI":"10.1111\/j.1468-2958.1979.tb00649.x","article-title":"Commonality Analysis: A Method for Decomposing Explained Variance in Multiple regression Analyses","volume":"5","author":"Seibold","year":"1979","journal-title":"Hum. Commun. Res."},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"320","DOI":"10.1111\/2041-210X.12166","article-title":"Using commonality analysis in multiple regressions: A tool to decompose regression effects in the face of multicollinearity","volume":"5","author":"Nimon","year":"2014","journal-title":"Methods Ecol. Evol."},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"345","DOI":"10.1177\/009365027900600305","article-title":"Rationale, Procedures, and Applications for Decomposition of Explained Variation in Multiple Regression Analyses","volume":"6","author":"McPhee","year":"1979","journal-title":"Commun. Res."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"12","DOI":"10.1016\/j.conb.2015.12.006","article-title":"Multivariate time series analysis of neuroscience data: some challenges and opportunities","volume":"37","author":"Pourahmadi","year":"2016","journal-title":"Curr. Opin. Neurobiol."},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Krumin, M., and Shoham, S. (2010). Multivariate Autoregressive Modeling and Granger Causality Analysis of Multiple Spike Trains. Comput. Intell. 
Neurosci., 2010.","DOI":"10.1155\/2010\/752428"},{"key":"ref_33","doi-asserted-by":"crossref","first-page":"157","DOI":"10.1007\/s001840000055","article-title":"Graphical interaction models of multivariate time series","volume":"51","author":"Dalhaus","year":"2000","journal-title":"Metrika"},{"key":"ref_34","first-page":"1","article-title":"Remarks concerning the graphical models for time series and point processes","volume":"16","author":"Brillinger","year":"1996","journal-title":"Braz. Rev. Econ."},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Palomar, D.P., and Eldar, Y.C. (2009). Graphical models of autoregressive processes. Convex Optimization in Signal Processing and Communications, Cambridge University Press.","DOI":"10.1017\/CBO9780511804458"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Meyer, C.D. (2000). Matrix Analysis and Applied Linear Algebra, Society for Applied and Industrial Mathematics.","DOI":"10.1137\/1.9780898719512"}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/20\/4\/240\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T14:59:09Z","timestamp":1760194749000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/20\/4\/240"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2018,3,30]]},"references-count":36,"journal-issue":{"issue":"4","published-online":{"date-parts":[[2018,4]]}},"alternative-id":["e20040240"],"URL":"https:\/\/doi.org\/10.3390\/e20040240","relation":{},"ISSN":["1099-4300"],"issn-type":[{"value":"1099-4300","type":"electronic"}],"subject":[],"published":{"date-parts":[[2018,3,30]]}}}