{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,25]],"date-time":"2026-04-25T05:55:39Z","timestamp":1777096539508,"version":"3.51.4"},"reference-count":35,"publisher":"MDPI AG","issue":"8","license":[{"start":{"date-parts":[[2023,8,7]],"date-time":"2023-08-07T00:00:00Z","timestamp":1691366400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"National Science and Technology Council, Taiwan","award":["NSTC 111-2221-E-019-047"],"award-info":[{"award-number":["NSTC 111-2221-E-019-047"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>In this paper, we provide geometric insights with visualization into the multivariate Gaussian distribution and its entropy and mutual information. In order to develop the multivariate Gaussian distribution with entropy and mutual information, several significant methodologies are presented through the discussion, supported by illustrations, both technically and statistically. The paper examines broad measurements of structure for the Gaussian distributions, which show that they can be described in terms of the information theory between the given covariance matrix and correlated random variables (in terms of relative entropy). The content obtained allows readers to better perceive concepts, comprehend techniques, and properly execute software programs for future study on the topic\u2019s science and implementations. It also helps readers grasp the themes\u2019 fundamental concepts to study the application of multivariate sets of data in Gaussian distribution. The simulation results also convey the behavior of different elliptical interpretations based on the multivariate Gaussian distribution with entropy for real-world applications in our daily lives, including information coding, nonlinear signal detection, etc. Involving the relative entropy and mutual information as well as the potential correlated covariance analysis, a wide range of information is addressed, including basic application concerns as well as clinical diagnostics to detect the multi-disease effects.<\/jats:p>","DOI":"10.3390\/e25081177","type":"journal-article","created":{"date-parts":[[2023,8,8]],"date-time":"2023-08-08T12:26:39Z","timestamp":1691497599000},"page":"1177","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":13,"title":["Geometric Insights into the Multivariate Gaussian Distribution and Its Entropy and Mutual Information"],"prefix":"10.3390","volume":"25","author":[{"given":"Dah-Jing","family":"Jwo","sequence":"first","affiliation":[{"name":"Department of Communications, Navigation and Control Engineering, National Taiwan Ocean University, 2 Peining Rd., Keelung 202301, Taiwan"}]},{"given":"Ta-Shun","family":"Cho","sequence":"additional","affiliation":[{"name":"Department of Business Administration, Asia University, 500 Liufeng Road, Wufeng, Taichung 41354, Taiwan"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-2446-8436","authenticated-orcid":false,"given":"Amita","family":"Biswal","sequence":"additional","affiliation":[{"name":"Department of Communications, Navigation and Control Engineering, National Taiwan Ocean University, 2 Peining Rd., Keelung 202301, Taiwan"}]}],"member":"1968","published-online":{"date-parts":[[2023,8,7]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"1019","DOI":"10.1109\/18.57201","article-title":"On channel capacity per unit cost","volume":"36","year":"1990","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"1118","DOI":"10.1109\/18.995552","article-title":"Fading channels: How perfect need perfect side information be?","volume":"48","author":"Lapidoth","year":"2002","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"1319","DOI":"10.1109\/TIT.2002.1003824","article-title":"Spectral efficiency in the wideband regime","volume":"48","year":"2002","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"1567","DOI":"10.1109\/TIT.2004.831784","article-title":"Second-order asymptotics of mutual information","volume":"50","author":"Prelov","year":"2004","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"350","DOI":"10.1109\/TIT.1969.1054307","article-title":"A general likelihood-ratio formula for random signals in Gaussian noise","volume":"IT-15","author":"Kailath","year":"1969","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"534","DOI":"10.1016\/S0019-9958(68)90960-1","article-title":"A note on least squares estimates from likelihood ratios","volume":"13","author":"Kailath","year":"1968","journal-title":"Inf. Control"},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"393","DOI":"10.1109\/TIT.1970.1054476","article-title":"A further note on a general likelihood formula for random signals in Gaussian noise","volume":"IT-16","author":"Kailath","year":"1970","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"46","DOI":"10.1016\/S0019-9958(72)90269-0","article-title":"On relations between detection and estimation of discrete time processes","volume":"20","author":"Jaffer","year":"1972","journal-title":"Inf. Control"},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"215","DOI":"10.1137\/0119020","article-title":"On the calculation of mutual information","volume":"19","author":"Duncan","year":"1970","journal-title":"SIAM J. Appl. Math."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"368","DOI":"10.1109\/TIT.1971.1054670","article-title":"Mutual information of the white Gaussian channel with and without feedback","volume":"17","author":"Kadota","year":"1971","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Amari, S.I. (2016). Information Geometry and Its Applications, Springer.","DOI":"10.1007\/978-4-431-55978-8"},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Schneidman, E., Still, S., Berry, M.J., and Bialek, W. (2003). Network information and connected correlations. Phys. Rev. Lett., 91.","DOI":"10.1103\/PhysRevLett.91.238701"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"119","DOI":"10.1007\/s10827-013-0458-4","article-title":"Synergy, redundancy, and multivariate information measures: An experimentalist\u2019s perspective","volume":"36","author":"Timme","year":"2014","journal-title":"J. Comput. Neurosci."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"688","DOI":"10.1109\/18.30996","article-title":"Entropy expressions and their estimators for multivariate distributions","volume":"35","author":"Ahmed","year":"1989","journal-title":"IEEE Trans. Inform. Theory"},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"324","DOI":"10.1016\/j.jmva.2003.10.003","article-title":"Estimation of the entropy of a multivariate normal distribution","volume":"92","author":"Misra","year":"2005","journal-title":"J. Multivar. Anal."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"42","DOI":"10.1111\/j.1467-9469.2011.00774.x","article-title":"Shannon entropy and mutual information for multivariate skew-elliptical distributions","volume":"40","author":"Genton","year":"2013","journal-title":"Scand. J. Stat."},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Liang, K.C., and Wang, X. (2008). Gene regulatory network reconstruction using conditional mutual information. EURASIP J. Bioinform. Syst. Biol., 2008.","DOI":"10.1155\/2008\/253894"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Novais, R.G., Wanke, P., Antunes, J., and Tan, Y. (2022). Portfolio optimization with a mean-entropy-mutual information model. Entropy, 24.","DOI":"10.3390\/e24030369"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Verd\u00fa, S. (2021). Error exponents and \u03b1-mutual information. Entropy, 23.","DOI":"10.3390\/e23020199"},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"1015","DOI":"10.1016\/j.mri.2008.02.019","article-title":"On the use of information theory for the analysis of the relationship between neural and imaging signals","volume":"26","author":"Panzeri","year":"2008","journal-title":"Magn. Reson. Imaging"},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"18720","DOI":"10.1073\/pnas.1107583108","article-title":"Inferring the structure and dynamics of interactions in schooling fish","volume":"108","author":"Katz","year":"2011","journal-title":"Proc. Natl. Acad. Sci. USA"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Cutsuridis, V., Hussain, A., and Taylor, J.G. (2011). Perception-Action Cycle: Models, Architectures, and Hardware, Springer Science & Business Media.","DOI":"10.1007\/978-1-4419-1452-1"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"161","DOI":"10.1007\/s12064-011-0137-9","article-title":"Information-driven self-organization: The dynamical system approach to autonomous robot behavior","volume":"131","author":"Ay","year":"2012","journal-title":"Theory Biosci."},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Rosas, F., Ntranos, V., Ellison, C.J., Pollin, S., and Verhelst, M. (2016). Understanding interdependency through complex information sharing. Entropy, 18.","DOI":"10.3390\/e18020038"},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Ince, R.A. (2017). The Partial Entropy Decomposition: Decomposing multivariate entropy and mutual information via pointwise common surprisal. arXiv.","DOI":"10.3390\/e19070318"},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Harder, M., Salge, C., and Polani, D. (2013). Bivariate measure of redundant information. Phys. Rev. E, 87.","DOI":"10.1103\/PhysRevE.87.012130"},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Rauh, J., Banerjee, P.K., Olbrich, E., Jost, J., and Bertschinger, N. (2017). On extractable shared information. Entropy, 19.","DOI":"10.3390\/e19070328"},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Ince, R.A. (2017). Measuring multivariate redundant information with pointwise common change in surprisal. Entropy, 19.","DOI":"10.3390\/e19070318"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Perrone, P., and Ay, N. (2016). Hierarchical quantification of synergy in channels. Front. Robot. AI, 2.","DOI":"10.3389\/frobt.2015.00035"},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"2161","DOI":"10.3390\/e16042161","article-title":"Quantifying unique information","volume":"16","author":"Bertschinger","year":"2014","journal-title":"Entropy"},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Chicharro, D., and Panzeri, S. (2017). Synergy and redundancy in dual decompositions of mutual information gain and information loss. Entropy, 19.","DOI":"10.3390\/e19020071"},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Michalowicz, J.V., Nichols, J.M., and Bucholtz, F. (2008). Calculation of differential entropy for a mixed Gaussian distribution. Entropy, 10.","DOI":"10.3390\/entropy-e10030200"},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Benish, W.A. (2020). A review of the application of information theory to clinical diagnostic testing. Entropy, 22.","DOI":"10.3390\/e22010097"},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Cadirci, M.S., Evans, D., Leonenko, N., and Makogin, V. (2022). Entropy-based test for generalised Gaussian distributions. Comput. Stat. Data Anal., 173.","DOI":"10.1016\/j.csda.2022.107502"},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Goethe, M., Fita, I., and Rubi, J.M. (2017). Testing the mutual information expansion of entropy with multivariate Gaussian distributions. J. Chem. Phys., 147.","DOI":"10.1063\/1.4996847"}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/25\/8\/1177\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T20:27:35Z","timestamp":1760128055000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/25\/8\/1177"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,8,7]]},"references-count":35,"journal-issue":{"issue":"8","published-online":{"date-parts":[[2023,8]]}},"alternative-id":["e25081177"],"URL":"https:\/\/doi.org\/10.3390\/e25081177","relation":{},"ISSN":["1099-4300"],"issn-type":[{"value":"1099-4300","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,8,7]]}}}