{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,7]],"date-time":"2026-03-07T07:05:30Z","timestamp":1772867130924,"version":"3.50.1"},"reference-count":42,"publisher":"MDPI AG","issue":"6","license":[{"start":{"date-parts":[[2012,6,19]],"date-time":"2012-06-19T00:00:00Z","timestamp":1340064000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/3.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>The application of the Maximum Entropy (ME) principle leads to a minimum of the Mutual Information (MI), I(X,Y), between random variables X,Y, which is compatible with prescribed joint expectations and given ME marginal distributions. A sequence of sets of joint constraints leads to a hierarchy of lower MI bounds increasingly approaching the true MI. In particular, using standard bivariate Gaussian marginal distributions, it allows for the MI decomposition into two positive terms: the Gaussian MI (Ig), depending upon the Gaussian correlation or the correlation between \u2018Gaussianized variables\u2019, and a non\u2011Gaussian MI (Ing), coinciding with joint negentropy and depending upon nonlinear correlations. Joint moments of a prescribed total order p are bounded within a compact set defined by Schwarz-like inequalities, where Ing grows from zero at the \u2018Gaussian manifold\u2019 where moments are those of Gaussian distributions, towards infinity at the set\u2019s boundary where a deterministic relationship holds. Sources of joint non-Gaussianity have been systematized by estimating Ing between the input and output from a nonlinear synthetic channel contaminated by multiplicative and non-Gaussian additive noises for a full range of signal-to-noise ratio (snr) variances. We have studied the effect of varying snr on Ig and Ing under several signal\/noise scenarios.<\/jats:p>","DOI":"10.3390\/e14061103","type":"journal-article","created":{"date-parts":[[2012,6,19]],"date-time":"2012-06-19T12:49:40Z","timestamp":1340110180000},"page":"1103-1126","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":18,"title":["Minimum Mutual Information and Non-Gaussianity Through the Maximum Entropy Method: Theory and Properties"],"prefix":"10.3390","volume":"14","author":[{"given":"Carlos A. L.","family":"Pires","sequence":"first","affiliation":[{"name":"Instituto Dom Luiz, Faculdade de Ci\u00eancias, University of Lisbon, DEGGE, Ed. C8, Campo-Grande, 1749-016 Lisbon, Portugal"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-5543-1754","authenticated-orcid":false,"given":"Rui A. P.","family":"Perdig\u00e3o","sequence":"additional","affiliation":[{"name":"Instituto Dom Luiz, Faculdade de Ci\u00eancias, University of Lisbon, DEGGE, Ed. C8, Campo-Grande, 1749-016 Lisbon, Portugal"}]}],"member":"1968","published-online":{"date-parts":[[2012,6,19]]},"reference":[{"key":"ref_1","unstructured":"Cover, T.M., and Thomas, J.A. (1991). Elements of Information Theory, John Wiley & Sons, Inc."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"383","DOI":"10.1111\/j.1751-5823.2010.00105.x","article-title":"Information measures in perspective","volume":"78","author":"Ebrahimi","year":"2010","journal-title":"Int. Stat. Rev."},{"key":"ref_3","first-page":"066123:1","article-title":"Least-dependent-component analysis based on mutual information","volume":"70","author":"Kraskov","year":"2004","journal-title":"Phys. Rev. E"},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"411","DOI":"10.1016\/S0893-6080(00)00026-5","article-title":"Independent Component Analysis: Algorithms and application","volume":"13","author":"Oja","year":"2000","journal-title":"Neural Network."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"1134","DOI":"10.1103\/PhysRevA.33.1134","article-title":"Independent coordinates for strange attractors from mutual information","volume":"33","author":"Fraser","year":"1986","journal-title":"Phys. Rev. A"},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"2425","DOI":"10.1175\/1520-0469(2004)061<2425:PAITPI>2.0.CO;2","article-title":"Predictability and information theory. Part I: Measures of predictability","volume":"61","author":"DelSole","year":"2004","journal-title":"J. Atmos. Sci."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"425","DOI":"10.4310\/MAA.2002.v9.n3.a8","article-title":"A mathematical framework for quantifying predictability through relative entropy. Methods and applications of analysis","volume":"9","author":"Majda","year":"2002","journal-title":"Meth. Appl. Anal."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"709","DOI":"10.1109\/18.825848","article-title":"Entropy expressions for multivariate continuous distributions","volume":"46","author":"Darbellay","year":"2000","journal-title":"IEEE Trans. Inform. Theor."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"173","DOI":"10.1016\/j.ins.2004.02.020","article-title":"Expressions for R\u00e9nyi and Shannon entropies for bivariate distributions","volume":"170","author":"Nadarajah","year":"2005","journal-title":"Inform. Sci."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"026209:1","DOI":"10.1103\/PhysRevE.76.026209","article-title":"Relative performance of mutual information estimation methods for quantifying the dependence among short and noisy data","volume":"76","author":"Khan","year":"2007","journal-title":"Phys. Rev. E"},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"389","DOI":"10.1007\/978-3-642-01307-2_36","article-title":"Estimation of mutual information: A survey","volume":"5589","author":"Li","year":"2009","journal-title":"Lect. Notes Comput. Sci."},{"key":"ref_12","unstructured":"Globerson, A., and Tishby, N. (2004, January 7\u201311). The minimum information principle for discriminative learning. Proceedings of the 20th conference on Uncertainty in artificial intelligence, Banff, Canada."},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"939","DOI":"10.1109\/PROC.1982.12425","article-title":"On the rationale of maximum-entropy methods","volume":"70","author":"Jaynes","year":"1982","journal-title":"Proc. IEEE"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Wackernagel, H. (1995). Multivariate Geostatistics\u2014An Introduction with Applications, Springer-Verlag.","DOI":"10.1007\/978-3-662-03098-1"},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"1026","DOI":"10.1080\/03610920902846521","article-title":"Convergent iterative procedure for constructing bivariate distributions","volume":"39","author":"Shams","year":"2010","journal-title":"Comm. Stat. Theor. Meth."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"1217","DOI":"10.1016\/j.jmva.2007.08.004","article-title":"Multivariate maximum entropy identification, transformation, and dependence","volume":"99","author":"Ebrahimi","year":"2008","journal-title":"J. Multivariate Anal."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"621","DOI":"10.1016\/j.jcp.2007.04.026","article-title":"An improved algorithm for the multidimensional moment-constrained maximum entropy problem","volume":"226","author":"Abramov","year":"2007","journal-title":"J. Comput. Phys."},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"96","DOI":"10.1016\/j.jcp.2008.08.020","article-title":"The multidimensional moment-constrained maximum entropy problem: A BFGS algorithm with constraint scaling","volume":"228","author":"Abramov","year":"2009","journal-title":"J. Comput. Phys."},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"377","DOI":"10.4310\/CMS.2010.v8.n2.a5","article-title":"The multidimensional maximum entropy moment problem: A review on numerical methods","volume":"8","author":"Abramov","year":"2010","journal-title":"Commun. Math. Sci."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"119","DOI":"10.1016\/S0304-4076(01)00092-6","article-title":"Entropy densities with an application to autoregressive conditional skewness and kurtosis","volume":"106","author":"Rockinger","year":"2002","journal-title":"J. Econometrics"},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"430","DOI":"10.1175\/MWR3407.1","article-title":"Non-Gaussianity and asymmetry of the winter monthly precipitation estimation from the NAO","volume":"135","author":"Pires","year":"2007","journal-title":"Mon. Wea. Rev."},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"066138:1","DOI":"10.1103\/PhysRevE.69.066138","article-title":"Estimating mutual information","volume":"69","author":"Kraskov","year":"2004","journal-title":"Phys. Rev. E"},{"key":"ref_23","unstructured":"Myers, J.L., and Well, A.D. (2003). Research Design and Statistical Analysis, Lawrence Erlbaum Associates. [2nd ed.]."},{"key":"ref_24","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1209\/0295-5075\/88\/68003","article-title":"An information-theoretic approach to statistical dependence: Copula information","volume":"88","author":"Calsaverini","year":"2009","journal-title":"Europhys. Lett."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"57","DOI":"10.5194\/npg-16-57-2009","article-title":"Information theoretic measures of dependence, compactness, and non-Gaussianity for multivariate probability distributions","volume":"16","author":"Monahan","year":"2009","journal-title":"Nonlinear Proc. Geoph."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"1261","DOI":"10.1109\/TIT.2005.844072","article-title":"Mutual information and minimum mean-square error in Gaussian channels","volume":"51","author":"Guo","year":"2005","journal-title":"IEEE Trans. Inform. Theor."},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Pires, C.A., and Perdig\u00e3o, R.A.P. (2012). Minimum mutual information and non-Gaussianity through the maximum entropy method: Estimation from finite samples. Entropy, submitted for publication.","DOI":"10.3390\/e14061103"},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"26","DOI":"10.1109\/TIT.1980.1056144","article-title":"Axiomatic derivation of the principle of maximum entropy and the principle of the minimum cross-entropy","volume":"26","author":"Shore","year":"1980","journal-title":"IEEE Trans. Inform. Theor."},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"261","DOI":"10.1006\/jmva.1995.1079","article-title":"Constructing multivariate distributions with specific marginal distributions","volume":"55","author":"Koehler","year":"1995","journal-title":"J. Multivariate Anal."},{"key":"ref_30","first-page":"19","article-title":"Generation of multivariate random variables with known marginal distribution and a specified correlation matrix","volume":"16","year":"2010","journal-title":"InterStat"},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"1903","DOI":"10.1162\/0899766054323026","article-title":"Edgeworth approximation of multivariate differential entropy","volume":"17","year":"2005","journal-title":"Neural Computat."},{"key":"ref_32","doi-asserted-by":"crossref","first-page":"287","DOI":"10.1016\/0165-1684(94)90029-9","article-title":"Independent component analysis, a new concept?","volume":"36","author":"Comon","year":"1994","journal-title":"Signal Process."},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"van der Vaart, A.W. (1998). Asymptotic Statistics, Cambridge University Press.","DOI":"10.1017\/CBO9780511802256"},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"342","DOI":"10.1007\/BF01443605","article-title":"\u00dcber die Darstellung definiter Formen als Summe von Formenquadraten","volume":"32","author":"Hilbert","year":"1888","journal-title":"Math. Ann."},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Ahmadi, A.A., and Parrilo, P.A. (2009, January 15\u201318). A positive definite polynomial hessian that does not factor. Proceedings of the Joint 48th IEEE Conference on Decision and Control and 28th Chinese Control Conference, Shanghai, China.","DOI":"10.1109\/CDC.2009.5400519"},{"key":"ref_36","unstructured":"Wolfram, S. (1996). The Mathematica Book, Cambridge University Press. [3rd ed.]."},{"key":"ref_37","doi-asserted-by":"crossref","first-page":"2997","DOI":"10.1175\/2010MWR3164.1","article-title":"Beyond Gaussian statistical modeling in geophysical data assimilation","volume":"138","author":"Bocquet","year":"2010","journal-title":"Mon. Wea. Rev."},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"1391","DOI":"10.1175\/JAS3408.1","article-title":"Multiplicative noise and non-Gaussianity: A paradigm for atmospheric regimes?","volume":"62","author":"Sura","year":"2005","journal-title":"J. Atmos. Sci."},{"key":"ref_39","doi-asserted-by":"crossref","first-page":"1261","DOI":"10.1109\/TIT.2005.844072","article-title":"Mutual information and minimum mean-square error in Gaussian channels","volume":"51","author":"Guo","year":"2005","journal-title":"IEEE Trans. Inform. Theor."},{"key":"ref_40","doi-asserted-by":"crossref","unstructured":"Rioul, O. (2007). A simple proof of the entropy-power inequality via properties of mutual information. arXiv, arXiv:cs\/0701050v2 [cs.IT].","DOI":"10.1109\/ISIT.2007.4557202"},{"key":"ref_41","doi-asserted-by":"crossref","first-page":"2371","DOI":"10.1109\/TIT.2011.2111010","article-title":"Estimation in gaussian noise: Properties of the minimum mean-square error","volume":"57","author":"Guo","year":"2011","journal-title":"IEEE T. Inform. Theory"},{"key":"ref_42","doi-asserted-by":"crossref","first-page":"407","DOI":"10.1007\/BF01589113","article-title":"Some numerical experiments with variable-storage quasi-Newton algorithms","volume":"45","author":"Gilbert","year":"1989","journal-title":"Math. Program."}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/14\/6\/1103\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T21:50:52Z","timestamp":1760219452000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/14\/6\/1103"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2012,6,19]]},"references-count":42,"journal-issue":{"issue":"6","published-online":{"date-parts":[[2012,6]]}},"alternative-id":["e14061103"],"URL":"https:\/\/doi.org\/10.3390\/e14061103","relation":{},"ISSN":["1099-4300"],"issn-type":[{"value":"1099-4300","type":"electronic"}],"subject":[],"published":{"date-parts":[[2012,6,19]]}}}