{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,5]],"date-time":"2026-02-05T07:08:48Z","timestamp":1770275328329,"version":"3.49.0"},"reference-count":26,"publisher":"SAGE Publications","issue":"6","license":[{"start":{"date-parts":[[2019,5,22]],"date-time":"2019-05-22T00:00:00Z","timestamp":1558483200000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/journals.sagepub.com\/page\/policies\/text-and-data-mining-license"}],"content-domain":{"domain":["journals.sagepub.com"],"crossmark-restriction":true},"short-container-title":["Journal of Intelligent &amp; Fuzzy Systems"],"published-print":{"date-parts":[[2019,6,11]]},"abstract":"<jats:p>\u00a0At present, manifold learning is mainly applied to dimensionality reduction. However, from viewpoint of dimensionality reduction, manifold learning algorithms are only local feature preserving algorithms. For example, Local Linear Embedding is local linear preserving, Local Tangent Space Alignment is local homeomorphic preserving and Laplacian Eigenmap is local similarity preserving. The community of dimensionality reduction is now pursuing the algorithms which can preserve both local and global features of data during dimensionality reduction. In this paper, a new algorithm of dimensionality reduction, called Hilbert-Schmidt Independence Criterion Regularized Manifold Learning (HSIC-ML for short), is proposed, in which HSIC between the high dimensional data and the dimension-reduced data is added as a regularization term to the objective functions of manifold learning. The addition of HSIC regularization term makes HSIC-ML capable of preserving both local and global features during dimensionality reduction. HSIC is a criterion measuring the statistical dependence between two data sets and has been widely applied to machine learning in recent years. 
However, since HSIC was first proposed around 2005, it seems never to have been applied directly to dimensionality reduction, nor used as a regularization term. The proposed HSIC-ML may be the first attempt in this respect. The experimental results presented in this paper show that manifold learning with HSIC regularization performs better than manifold learning without it.<\/jats:p>","DOI":"10.3233\/jifs-181379","type":"journal-article","created":{"date-parts":[[2019,5,24]],"date-time":"2019-05-24T10:53:10Z","timestamp":1558695190000},"page":"5547-5558","update-policy":"https:\/\/doi.org\/10.1177\/sage-journals-update-policy","source":"Crossref","is-referenced-by-count":3,"title":["HSIC regularized manifold learning"],"prefix":"10.1177","volume":"36","author":[{"given":"Xinghua","family":"Zheng","sequence":"first","affiliation":[{"name":"School of Data and Computer Science, Sun Yat-Sen University, Guangzhou, Guangdong, China"}]},{"given":"Zhengming","family":"Ma","sequence":"additional","affiliation":[{"name":"Nanfang College, Sun Yat-sen University, Guangzhou, China"}]},{"given":"Hanjian","family":"Che","sequence":"additional","affiliation":[{"name":"School of Electronics and Information Technology, Sun Yat-sen University, Guangzhou, China"}]},{"given":"Lei","family":"Li","sequence":"additional","affiliation":[{"name":"School of Data and Computer Science, Sun Yat-Sen University, Guangzhou, Guangdong, China"}]}],"member":"179","published-online":{"date-parts":[[2019,5,22]]},"reference":[{"key":"e_1_3_1_2_2","doi-asserted-by":"publisher","DOI":"10.1126\/science.290.5500.2323"},{"key":"e_1_3_1_3_2","doi-asserted-by":"publisher","DOI":"10.1137\/S1064827502419154"},{"key":"e_1_3_1_4_2","doi-asserted-by":"publisher","DOI":"10.1073\/pnas.1031596100"},{"issue":"6","key":"e_1_3_1_5_2","first-page":"585","article-title":"Laplacian eigenmaps and spectral techniques for embedding and clustering","volume":"14","author":"Belkin 
M.","year":"2001","unstructured":"BelkinM., NiyogiP., Laplacian eigenmaps and spectral techniques for embedding and clustering, Advances in Neural Information Processing Systems 14(6) (2001), 585\u2013591.","journal-title":"Advances in Neural Information Processing Systems"},{"issue":"1","key":"e_1_3_1_6_2","first-page":"186","article-title":"Locality preserving projections","volume":"16","author":"He X.","year":"2003","unstructured":"HeX., NiyogiP., Locality preserving projections, Advances in Neural Information Processing Systems 16(1) (2003), 186\u2013197.","journal-title":"Advances in Neural Information Processing Systems"},{"key":"e_1_3_1_7_2","doi-asserted-by":"publisher","DOI":"10.1109\/TSMCB.2012.2198916"},{"issue":"1","key":"e_1_3_1_8_2","first-page":"2399","article-title":"Manifold regularization: A geometric framework for learning from labeled and unlabeled examples","volume":"7","author":"Belkin M.","year":"2006","unstructured":"BelkinM., NiyogiP., SindhwaniV., Manifold regularization: A geometric framework for learning from labeled and unlabeled examples, Journal of Machine Learning Research 7(1) (2006), 2399\u20132434.","journal-title":"Journal of Machine Learning Research"},{"issue":"1","key":"e_1_3_1_9_2","first-page":"66","article-title":"Dimensionality reduction: A comparative review","volume":"9","author":"van der Maaten L.J.P.","year":"2009","unstructured":"van der MaatenL.J.P., PostmaE.O. 
and van den HerikH.J., Dimensionality reduction: A comparative review, J Mach Learn Res 9(1) (2009), 66\u201371.","journal-title":"J Mach Learn Res"},{"issue":"1","key":"e_1_3_1_10_2","first-page":"2859","article-title":"Linear dimensionality reduction: Survey, insights, and generalizations","volume":"16","author":"Cunningham J.P.","year":"2015","unstructured":"CunninghamJ.P., GhahramaniZ., Linear dimensionality reduction: Survey, insights, and generalizations, Journal of Machine Learning Research 16(1) (2015), 2859\u20132900.","journal-title":"Journal of Machine Learning Research"},{"key":"e_1_3_1_11_2","doi-asserted-by":"publisher","DOI":"10.1214\/12-STS406"},{"key":"e_1_3_1_12_2","first-page":"63","article-title":"Measuring statistical dependence with hilbert-schmidt norms","volume":"3734","author":"Gretton A.","year":"2005","unstructured":"GrettonA., BousquetO., SmolaA., ScholkopfB., Measuring statistical dependence with hilbert-schmidt norms, ALT, LNAI 3734 (2005), 63\u201378.","journal-title":"ALT, LNAI"},{"key":"e_1_3_1_13_2","doi-asserted-by":"publisher","DOI":"10.1162\/089976698300017467"},{"key":"e_1_3_1_14_2","first-page":"41","article-title":"Fisher discriminant analysis with kernels","author":"Mika S.","year":"1999","unstructured":"MikaS., R\u00e4tschG., WestonJ., Sch\u00f6lkopfB., M\u00fcllerK.R., Fisher discriminant analysis with kernels, Neural Networks for Signal Processing IX (1999), 41\u201348.","journal-title":"Neural Networks for Signal Processing IX"},{"key":"e_1_3_1_15_2","unstructured":"Sch\u00f6lkopfB. ChristopherJ.C. SmolaA. 
Advances in Kernel Methods: Support Vector Learning MIT Press Cambridge MA 1999."},{"key":"e_1_3_1_16_2","doi-asserted-by":"publisher","DOI":"10.1090\/S0002-9947-1950-0051437-7"},{"key":"e_1_3_1_17_2","doi-asserted-by":"publisher","DOI":"10.1109\/TGRS.2016.2642479"},{"key":"e_1_3_1_18_2","first-page":"388","volume-title":"Proceedings of the 30th International Conference on Machine Learning","author":"Zhang K.","year":"2013","unstructured":"ZhangK., ZhengV., WangQ., KwokJ., YangQ., MarsicI., Covariate Shift in Hilbert Space: A Solution via Surrogate Kernels, Proceedings of the 30th International Conference on Machine Learning, 2013, pp. 388\u2013395."},{"key":"e_1_3_1_19_2","doi-asserted-by":"publisher","DOI":"10.1109\/TCBB.2016.2631164"},{"issue":"4","key":"e_1_3_1_20_2","first-page":"280","article-title":"Statistical data analysis based on the L1-Norm and related methods","volume":"6","author":"Dodge Y.","year":"2002","unstructured":"DodgeY., Statistical data analysis based on the L1-Norm and related methods, Computational Statistics and Data Analysis 6(4) (2002), 280\u2013282.","journal-title":"Computational Statistics and Data Analysis"},{"issue":"11","key":"e_1_3_1_21_2","first-page":"1393","article-title":"Feature selection via dependence maximization","volume":"13","author":"Song L.","year":"2012","unstructured":"SongL., SmolaA., GrettonA., BedoJ., BorgwardtK., Feature selection via dependence maximization, Journal of Machine Learning Research 13(11) (2012), 1393\u20131434.","journal-title":"Journal of Machine Learning 
Research"},{"key":"e_1_3_1_22_2","doi-asserted-by":"publisher","DOI":"10.1109\/LGRS.2010.2041896"},{"key":"e_1_3_1_23_2","doi-asserted-by":"publisher","DOI":"10.1109\/TSP.2013.2274276"},{"key":"e_1_3_1_24_2","doi-asserted-by":"publisher","DOI":"10.1109\/TASLP.2014.2319157"},{"key":"e_1_3_1_25_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.patcog.2010.12.015"},{"key":"e_1_3_1_26_2","doi-asserted-by":"publisher","DOI":"10.1109\/34.908974"},{"key":"e_1_3_1_27_2","doi-asserted-by":"publisher","DOI":"10.1109\/TCYB.2017.2655338"}],"container-title":["Journal of Intelligent &amp; Fuzzy Systems"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.3233\/JIFS-181379","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/full-xml\/10.3233\/JIFS-181379","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.3233\/JIFS-181379","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,2,4]],"date-time":"2026-02-04T18:19:15Z","timestamp":1770229155000},"score":1,"resource":{"primary":{"URL":"https:\/\/journals.sagepub.com\/doi\/10.3233\/JIFS-181379"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2019,5,22]]},"references-count":26,"journal-issue":{"issue":"6","published-print":{"date-parts":[[2019,6,11]]}},"alternative-id":["10.3233\/JIFS-181379"],"URL":"https:\/\/doi.org\/10.3233\/jifs-181379","relation":{},"ISSN":["1064-1246","1875-8967"],"issn-type":[{"value":"1064-1246","type":"print"},{"value":"1875-8967","type":"electronic"}],"subject":[],"published":{"date-parts":[[2019,5,22]]}}}