{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,10]],"date-time":"2026-02-10T00:48:16Z","timestamp":1770684496877,"version":"3.49.0"},"reference-count":23,"publisher":"SAGE Publications","issue":"1","license":[{"start":{"date-parts":[[2015,8,31]],"date-time":"2015-08-31T00:00:00Z","timestamp":1440979200000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/journals.sagepub.com\/page\/policies\/text-and-data-mining-license"}],"content-domain":{"domain":["journals.sagepub.com"],"crossmark-restriction":true},"short-container-title":["Journal of Intelligent &amp; Fuzzy Systems"],"published-print":{"date-parts":[[2015,8,31]]},"abstract":"<jats:p>\n                    Fast Reduced set density estimator (FRSDE) is an important technique to realize the fast kernel density estimation based on the fast minimal enclosing ball (MEB) approximation technique. However, its performance on the running time is severely affected by the approximation parameter\n                    <jats:italic>\u025b<\/jats:italic>\n                    used in this algorithm, where a smaller value will lead to more accurate approximation but heavy learning burden. In this study, we reveal that the random Gaussian white noise manually added to the data will speed up the learning and accordingly propose a speedup version of FRSDE, i.e., the noise-benefit FRSDE (NB-FRSDE). NB-FRSDE can realize such a speedup because a larger value of\n                    <jats:italic>\u025b<\/jats:italic>\n                    can be used on the noisy version of the original data to obtain the equivalent approximation performance, which only can be obtained by FRSDE on the original data with a smaller value of\n                    <jats:italic>\u025b<\/jats:italic>\n                    . 
NB-FRSDE has the following distinctive characteristics: (1) Its implementation is very simple, since NB-FRSDE is identical to FRSDE except that Gaussian noise is manually added to the original data. (2) While most existing machine learning methods try to remove noise in order to overcome its influence, NB-FRSDE instead benefits from the manually added noise in terms of average running time. Experimental studies on density estimation and its application to image segmentation demonstrate these advantages.\n                  <\/jats:p>","DOI":"10.3233\/ifs-151768","type":"journal-article","created":{"date-parts":[[2016,1,15]],"date-time":"2016-01-15T12:23:55Z","timestamp":1452860635000},"page":"443-450","update-policy":"https:\/\/doi.org\/10.1177\/sage-journals-update-policy","source":"Crossref","is-referenced-by-count":0,"title":["Noise-benefit FRSDE for speedup of density estimation on large data"],"prefix":"10.1177","volume":"30","author":[{"given":"Guanjin","family":"Wang","sequence":"first","affiliation":[{"name":"School of Nursing, Hong Kong Polytechnic University, Hong Kong, China"},{"name":"Centre for Smart Health, School of Nursing, Hong Kong Polytechnic University, Hong Kong, China"}]},{"given":"Kup-Sze","family":"Choi","sequence":"additional","affiliation":[{"name":"School of Nursing, Hong Kong Polytechnic University, Hong Kong, China"},{"name":"Centre for Smart Health, School of Nursing, Hong Kong Polytechnic University, Hong Kong, China"}]},{"given":"Zhaohong","family":"Deng","sequence":"additional","affiliation":[{"name":"School of Digital Media, JiangNan University, Wuxi, Jiangsu, China"}]}],"member":"179","published-online":{"date-parts":[[2015,9,9]]},"reference":[{"key":"e_1_3_2_2_2","unstructured":"SilvermanB.W. Density Estimation for Statistics and Data Analysis Chapman and Hall CRC Press 1986."},{"key":"e_1_3_2_3_2","doi-asserted-by":"crossref","unstructured":"BishopC. 
Neural Networks for Pattern Recognition Oxford Univ Press 1995.","DOI":"10.1093\/oso\/9780198538493.001.0001"},{"key":"e_1_3_2_4_2","doi-asserted-by":"crossref","first-page":"1065","DOI":"10.1214\/aoms\/1177704472","article-title":"On estimation of a probability density function and mode","volume":"33","author":"Parzen E.","year":"1962","unstructured":"ParzenE., On estimation of a probability density function and mode, Annals of Math Statistics 33 (1962), 1065\u20131076.","journal-title":"Annals of Math Statistics"},{"key":"e_1_3_2_5_2","article-title":"Support vector method for multivariate density estimation, center for biological and computational learning","volume":"170","author":"Mukherjee S.","year":"1999","unstructured":"MukherjeeS. and VapnikV., Support vector method for multivariate density estimation, center for biological and computational learning, Department of Brain and Cognitive Sciences, MIT 170 (1999).","journal-title":"Department of Brain and Cognitive Sciences, MIT"},{"key":"e_1_3_2_6_2","doi-asserted-by":"crossref","first-page":"1353","DOI":"10.1080\/03610928508828980","article-title":"Kernel density estimation with binned data","volume":"14","author":"Scott D.W.","year":"1985","unstructured":"ScottD.W. and SheatherS.J., Kernel density estimation with binned data, Comm Statistics-Theory and Methods 14 (1985), 1353\u20131359.","journal-title":"Comm Statistics-Theory and Methods"},{"key":"e_1_3_2_7_2","unstructured":"VapnikV. and MukherjeeS. Support Vector Method for Multivariate Density Estimation Advances in Neural Information Processing Systems MIT Press 2000 659\u2013665."},{"key":"e_1_3_2_8_2","doi-asserted-by":"crossref","unstructured":"Holmstr\u00f6mL. The Error and the Computational Complexity of a Multivariate Binned Kernel Density Estimator J. 
Multivariate Analysis 72(2) (2000) 264\u2013309.","DOI":"10.1006\/jmva.1999.1863"},{"issue":"9","key":"e_1_3_2_9_2","doi-asserted-by":"crossref","first-page":"950","DOI":"10.1109\/34.310693","article-title":"Fast Parzen density estimation using clustering-based branch and bound","volume":"16","author":"Jeon B.","year":"1994","unstructured":"JeonB. and LandgrebeD.A., Fast Parzen density estimation using clustering-based branch and bound, IEEE Trans Pattern Analysis and Machine Intelligence 16(9) (1994), 950\u2013954.","journal-title":"IEEE Trans Pattern Analysis and Machine Intelligence"},{"issue":"10","key":"e_1_3_2_10_2","doi-asserted-by":"crossref","first-page":"1253","DOI":"10.1109\/TPAMI.2003.1233899","article-title":"Probability density estimation from optimally condensed data samples","volume":"25","author":"Girolami M.","year":"2003","unstructured":"GirolamiM. and HeC., Probability density estimation from optimally condensed data samples, IEEE Transactions on Pattern Analysis and Machine Intelligence 25(10) (2003), 1253\u20131264.","journal-title":"IEEE Transactions on Pattern Analysis and Machine Intelligence"},{"issue":"4","key":"e_1_3_2_11_2","doi-asserted-by":"crossref","first-page":"1363","DOI":"10.1016\/j.patcog.2007.09.013","article-title":"FRSDE: Fast reduced set density estimator using minimal enclosing ball approximation","volume":"41","author":"Deng Z.H.","year":"2008","unstructured":"DengZ.H., ChungF.L. and WangS.T., FRSDE: Fast reduced set density estimator using minimal enclosing ball approximation, Pattern Recognition 41(4) (2008), 1363\u20131372.","journal-title":"Pattern Recognition"},{"issue":"5","key":"e_1_3_2_12_2","doi-asserted-by":"crossref","first-page":"1126","DOI":"10.1109\/TNN.2006.878123","article-title":"Generalized core vector machines","volume":"17","author":"Tsang I.W.","year":"2006","unstructured":"TsangI.W., KwokJ.T. 
and ZuradaJ.M., Generalized core vector machines, IEEE Transactions on Neural Networks 17(5) (2006), 1126\u20131140.","journal-title":"IEEE Transactions on Neural Networks"},{"key":"e_1_3_2_13_2","unstructured":"B\u0103doiuM. and ClarksonK.L. Optimal core sets for balls In DIMACS Workshop on Computational Geometry 2002."},{"key":"e_1_3_2_14_2","first-page":"250","article-title":"Approximate clustering via core sets","author":"B\u0103doiu M.","year":"2002","unstructured":"B\u0103doiuM., Har-PeledS. and IndykP., Approximate clustering via core sets, In Proceedings of 34th Annual ACM Symposium on Theory of Computing, 2002, pp. 250\u2013257.","journal-title":"In Proceedings of 34th Annual ACM Symposium on Theory of Computing"},{"issue":"4","key":"e_1_3_2_15_2","doi-asserted-by":"crossref","first-page":"563","DOI":"10.1007\/s11704-014-3105-y","article-title":"Dm-KDE: Dynamical kernel density estimation by sequences of KDE estimators with fixed number of components over data streams","volume":"8","author":"Xu M.","year":"2014","unstructured":"XuM., IshibuchiH., GuX. and WangS., Dm-KDE: Dynamical kernel density estimation by sequences of KDE estimators with fixed number of components over data streams, Frontiers of Computer Science 8(4) (2014), 563\u2013580.","journal-title":"Frontiers of Computer Science"},{"issue":"2","key":"e_1_3_2_16_2","doi-asserted-by":"crossref","first-page":"157","DOI":"10.3233\/IDA-140635","article-title":"Density estimation of high dimensional data using ICA and Bayesian networks","volume":"18","author":"Abdenebi R.","year":"2014","unstructured":"AbdenebiR., ChitroubS. 
and BouridaneA., Density estimation of high dimensional data using ICA and Bayesian networks, Intelligent Data Analysis 18(2) (2014), 157\u2013179.","journal-title":"Intelligent Data Analysis"},{"issue":"1","key":"e_1_3_2_17_2","doi-asserted-by":"crossref","first-page":"29","DOI":"10.1007\/s10994-012-5298-3","article-title":"Density estimation with minimization of U-divergence","volume":"90","author":"Naito K.","year":"2013","unstructured":"NaitoK. and EguchiS., Density estimation with minimization of U-divergence, Machine Learning 90(1) (2013), 29\u201357.","journal-title":"Machine Learning"},{"key":"e_1_3_2_18_2","doi-asserted-by":"publisher","DOI":"10.1109\/TSMCB.2012.2236828"},{"issue":"3","key":"e_1_3_2_19_2","doi-asserted-by":"crossref","first-page":"355","DOI":"10.1109\/TCYB.2013.2255983","article-title":"Online discriminative kernel density estimator with Gaussian kernels","volume":"44","author":"Kristan M.","year":"2014","unstructured":"KristanM. and LeonardisA., Online discriminative kernel density estimator with Gaussian kernels, IEEE Transactions on Cybernetics 44(3) (2014), 355\u2013365.","journal-title":"IEEE Transactions on Cybernetics"},{"key":"e_1_3_2_20_2","doi-asserted-by":"crossref","first-page":"122","DOI":"10.1016\/j.neucom.2013.02.003","article-title":"Sparse probability density function estimation using the minimum integrated square error","volume":"115","author":"Hong X.","year":"2013","unstructured":"HongX., ChenS., QatawnehA., DaqrouqK., SheikhM. 
and MorfeqA., Sparse probability density function estimation using the minimum integrated square error, Neurocomputing 115 (2013), 122\u2013129.","journal-title":"Neurocomputing"},{"key":"e_1_3_2_21_2","doi-asserted-by":"publisher","DOI":"10.1109\/TFUZZ.2010.2091961"},{"issue":"1","key":"e_1_3_2_22_2","doi-asserted-by":"crossref","first-page":"173","DOI":"10.1109\/TFUZZ.2008.2006620","article-title":"From minimum enclosing ball to fast fuzzy inference system training on large datasets","volume":"17","author":"Chung F.L.","year":"2009","unstructured":"ChungF.L., DengZ.H. and WangS., From minimum enclosing ball to fast fuzzy inference system training on large datasets, IEEE Transactions on Fuzzy Systems 17(1) (2009), 173\u2013184.","journal-title":"IEEE Transactions on Fuzzy Systems"},{"key":"e_1_3_2_23_2","unstructured":"YanP. ZhangC. Pattern Recognition Beijing: Tsinghua University Press 2000."},{"issue":"4","key":"e_1_3_2_24_2","doi-asserted-by":"crossref","first-page":"444","DOI":"10.1109\/TFUZZ.2004.841748","article-title":"Image segmentation based on adaptive cluster prototype estimation","volume":"13","author":"Liew A.W.C.","year":"2005","unstructured":"LiewA.W.C., YanH. 
and LawN.F., Image segmentation based on adaptive cluster prototype estimation, IEEE Transactions on Fuzzy Systems 13(4) (2005), 444\u2013453.","journal-title":"IEEE Transactions on Fuzzy Systems"}],"container-title":["Journal of Intelligent &amp; Fuzzy Systems"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.3233\/IFS-151768","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/full-xml\/10.3233\/IFS-151768","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.3233\/IFS-151768","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,2,9]],"date-time":"2026-02-09T07:27:36Z","timestamp":1770622056000},"score":1,"resource":{"primary":{"URL":"https:\/\/journals.sagepub.com\/doi\/10.3233\/IFS-151768"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2015,8,31]]},"references-count":23,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2015,8,31]]}},"alternative-id":["10.3233\/IFS-151768"],"URL":"https:\/\/doi.org\/10.3233\/ifs-151768","relation":{},"ISSN":["1064-1246","1875-8967"],"issn-type":[{"value":"1064-1246","type":"print"},{"value":"1875-8967","type":"electronic"}],"subject":[],"published":{"date-parts":[[2015,8,31]]}}}