{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,12]],"date-time":"2025-10-12T02:44:09Z","timestamp":1760237049469,"version":"build-2065373602"},"reference-count":35,"publisher":"MDPI AG","issue":"2","license":[{"start":{"date-parts":[[2020,2,19]],"date-time":"2020-02-19T00:00:00Z","timestamp":1582070400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>A method for estimating the Shannon differential entropy of multidimensional random variables using independent samples is described. The method is based on decomposing the distribution into a product of marginal distributions and joint dependency, also known as the copula. The entropy of marginals is estimated using one-dimensional methods. The entropy of the copula, which always has a compact support, is estimated recursively by splitting the data along statistically dependent dimensions. The method can be applied both for distributions with compact and non-compact supports, which is imperative when the support is not known or of a mixed type (in different dimensions). At high dimensions (larger than 20), numerical examples demonstrate that our method is not only more accurate, but also significantly more efficient than existing approaches.<\/jats:p>","DOI":"10.3390\/e22020236","type":"journal-article","created":{"date-parts":[[2020,2,21]],"date-time":"2020-02-21T08:59:47Z","timestamp":1582275587000},"page":"236","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":13,"title":["Estimating Differential Entropy using Recursive Copula Splitting"],"prefix":"10.3390","volume":"22","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-7251-5383","authenticated-orcid":false,"given":"Gil","family":"Ariel","sequence":"first","affiliation":[{"name":"Department of Mathematics, Bar Ilan University, Ramat Gan 5290002, Israel"}]},{"given":"Yoram","family":"Louzoun","sequence":"additional","affiliation":[{"name":"Department of Mathematics, Bar Ilan University, Ramat Gan 5290002, Israel"}]}],"member":"1968","published-online":{"date-parts":[[2020,2,19]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"1667","DOI":"10.1109\/TPAMI.2002.1114861","article-title":"Input feature selection by mutual information based on parzen window","volume":"24","author":"Kwak","year":"2002","journal-title":"IEEE Trans. Pattern Anal. Mach. Intell."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"1168","DOI":"10.1016\/j.patrec.2009.11.010","article-title":"Textural feature selection by joint mutual information based on gaussian mixture model for multispectral image classification","volume":"31","author":"Kerroum","year":"2010","journal-title":"Pattern Recognit. Lett."},{"key":"ref_3","first-page":"25","article-title":"Feature selection for gene expression using model-based entropy","volume":"7","author":"Zhu","year":"2008","journal-title":"IEEE\/ACM Trans. Comput. Biol. Bioinform."},{"key":"ref_4","unstructured":"Faivishevsky, L., and Goldberger, J. (2008, January 8\u201310). ICA based on a smooth estimation of the differential entropy. Proceedings of the Advances in Neural Information Processing Systems 21 (NIPS 2008), Vancouver, BC, Canada."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"68003","DOI":"10.1209\/0295-5075\/88\/68003","article-title":"An information-theoretic approach to statistical dependence: Copula information","volume":"88","author":"Calsaverini","year":"2009","journal-title":"Europhys. Lett."},{"key":"ref_6","doi-asserted-by":"crossref","first-page":"178102","DOI":"10.1103\/PhysRevLett.123.178102","article-title":"Universal and accessible entropy estimation using a compression algorithm","volume":"123","author":"Avinery","year":"2019","journal-title":"Phys. Rev. Lett."},{"key":"ref_7","first-page":"011031","article-title":"Quantifying hidden order out of equilibrium","volume":"9","author":"Martiniani","year":"2019","journal-title":"Phys. Rev. X"},{"key":"ref_8","first-page":"17","article-title":"Nonparametric entropy estimation: An overview","volume":"6","author":"Beirlant","year":"1997","journal-title":"Int. J. Math. Stat. Sci."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"1191","DOI":"10.1162\/089976603321780272","article-title":"Estimation of entropy and mutual information","volume":"15","author":"Paninski","year":"2003","journal-title":"Neural Comput."},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"371","DOI":"10.1111\/j.1467-9892.1994.tb00200.x","article-title":"Using the mutual information coefficient to identify lags in nonlinear models","volume":"15","author":"Granger","year":"1994","journal-title":"J. Time Ser. Anal."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Sricharan, K., Raich, R., and Hero, A.O. (2010). Empirical estimation of entropy functionals with confidence. arXiv.","DOI":"10.1109\/ISIT.2011.6033726"},{"key":"ref_12","doi-asserted-by":"crossref","first-page":"1315","DOI":"10.1109\/18.761290","article-title":"Estimation of the information by an adaptive partitioning of the observation space","volume":"45","author":"Darbellay","year":"1999","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_13","doi-asserted-by":"crossref","first-page":"537","DOI":"10.1109\/LSP.2009.2017346","article-title":"Fast multidimensional entropy estimation by k-d partitioning","volume":"16","author":"Stowell","year":"2009","journal-title":"IEEE Signal Process. Lett."},{"key":"ref_14","first-page":"9","article-title":"Sample estimate of the entropy of a random vector","volume":"23","author":"Kozachenko","year":"1987","journal-title":"Probl. Peredachi Informatsii"},{"key":"ref_15","doi-asserted-by":"crossref","first-page":"066138","DOI":"10.1103\/PhysRevE.69.066138","article-title":"Estimating mutual information","volume":"69","author":"Kraskov","year":"2004","journal-title":"Phys. Rev. E"},{"key":"ref_16","doi-asserted-by":"crossref","unstructured":"Gao, W., Oh, S., and Viswanath, P. (2017, January 25\u201330). Density functional estimators with k-nearest neighbor bandwidths. Proceedings of the 2017 IEEE International Symposium on Information Theory (ISIT), Aachen, Germany.","DOI":"10.1109\/ISIT.2017.8006749"},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"033114","DOI":"10.1063\/1.5011683","article-title":"Geometric k-nearest neighbor estimation of entropy and mutual information","volume":"28","author":"Lord","year":"2018","journal-title":"Chaos"},{"key":"ref_18","doi-asserted-by":"crossref","first-page":"683","DOI":"10.1007\/BF00057735","article-title":"Estimation of entropy and other functionals of a multivariate density","volume":"41","author":"Joe","year":"1989","journal-title":"Ann. Inst. Stat. Math."},{"key":"ref_19","first-page":"301","article-title":"Nearest neighbor estimates of entropy","volume":"23","author":"Singh","year":"2003","journal-title":"Am. J. Math. Manag. Sci."},{"key":"ref_20","doi-asserted-by":"crossref","first-page":"1045","DOI":"10.1016\/j.sigpro.2004.11.022","article-title":"Fast kernel entropy estimation and optimization","volume":"85","author":"Shwartz","year":"2005","journal-title":"Signal Process."},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"1978","DOI":"10.1109\/TNN.2008.2006167","article-title":"Continuously differentiable sample-spacing entropy estimates","volume":"19","author":"Ozertem","year":"2008","journal-title":"IEEE Trans. Neural Netw."},{"key":"ref_22","unstructured":"Gao, W., Oh, S., and Viswanath, P. (2016, January 5\u201310). Breaking the bandwidth barrier: Geometrical adaptive entropy estimation. Proceedings of the Advances in Neural Information Processing Systems 29 (NIPS 2016), Barcelona, Spain."},{"key":"ref_23","unstructured":"Indyk, P., Kleinberg, R., Mahabadi, S., and Yuan, Y. (2016). Simultaneous nearest neighbor search. arXiv."},{"key":"ref_24","unstructured":"Miller, E.G. (2003, January 6\u201310). A new class of entropy estimators for multi-dimensional densities. Proceedings of the 2003 IEEE International Conference on Acoustics, Speech, and Signal Processing, Hong Kong, China."},{"key":"ref_25","doi-asserted-by":"crossref","first-page":"4374","DOI":"10.1109\/TIT.2013.2251456","article-title":"Ensemble estimators for multivariate entropy estimation","volume":"59","author":"Sricharan","year":"2013","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Jaworski, P., Durante, F., Hardle, W.K., and Rychlik, T. (2010). Copula Theory and Its Applications, Springer.","DOI":"10.1007\/978-3-642-12465-5"},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Durante, F., and Sempi, C. (2010). Copula theory: An introduction. Copula Theory and Its Applications, Springer.","DOI":"10.1007\/978-3-642-12465-5_1"},{"key":"ref_28","doi-asserted-by":"crossref","first-page":"5154","DOI":"10.3390\/e15125154","article-title":"Non\u2013parametric estimation of mutual information through the entropy of the linkage","volume":"15","author":"Giraudo","year":"2013","journal-title":"Entropy"},{"key":"ref_29","doi-asserted-by":"crossref","first-page":"2253","DOI":"10.3390\/e17042253","article-title":"Integrating entropy and copula theories for hydrologic modeling and analysis","volume":"17","author":"Hao","year":"2015","journal-title":"Entropy"},{"key":"ref_30","first-page":"887","article-title":"Transfer entropy estimation via copula","volume":"138","author":"Xue","year":"2017","journal-title":"Adv. Eng. Res."},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"81","DOI":"10.1017\/asb.2013.6","article-title":"Statistical inference for copulas in high dimensions: A simulation study","volume":"43","author":"Embrechts","year":"2013","journal-title":"Astin Bull. J. IAA"},{"key":"ref_32","unstructured":"Dan Stowell (2020, February 16). k-d Partitioning Entropy Estimator: A Fast Estimator for the Entropy of Multidimensional Data Distributions. Available online: https:\/\/github.com\/danstowell\/kdpee."},{"key":"ref_33","unstructured":"Kalle Rutanen (2020, February 16). TIM, A C++ Library for Efficient Estimation of Information-Theoretic Measures from Time-Series\u2019 in Arbitrary Dimensions. Available online: https:\/\/kaba.hilvi.org\/homepage\/main.htm."},{"key":"ref_34","unstructured":"Knuth, K.H. (2006). Optimal data-based binning for histograms. arXiv."},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"813","DOI":"10.1093\/biomet\/asx050","article-title":"Distribution-free tests of independence in high dimensions","volume":"104","author":"Han","year":"2017","journal-title":"Biometrika"}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/22\/2\/236\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T08:59:14Z","timestamp":1760173154000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/22\/2\/236"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2020,2,19]]},"references-count":35,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2020,2]]}},"alternative-id":["e22020236"],"URL":"https:\/\/doi.org\/10.3390\/e22020236","relation":{},"ISSN":["1099-4300"],"issn-type":[{"type":"electronic","value":"1099-4300"}],"subject":[],"published":{"date-parts":[[2020,2,19]]}}}