{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,12]],"date-time":"2025-10-12T03:45:41Z","timestamp":1760240741619,"version":"build-2065373602"},"reference-count":32,"publisher":"MDPI AG","issue":"9","license":[{"start":{"date-parts":[[2019,9,2]],"date-time":"2019-09-02T00:00:00Z","timestamp":1567382400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["91648204"],"award-info":[{"award-number":["91648204"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Entropy"],"abstract":"<jats:p>Kernels play a crucial role in Gaussian process regression. Analyzing kernels in their spectral domain has attracted extensive attention in recent years. Gaussian mixture models (GMMs) are used to model the spectra of kernels. However, the number of components in a GMM is fixed, so the model is prone to overfitting or underfitting. In this paper, we combine the spectral domain of kernels with nonparametric Bayesian models. Dirichlet process mixture models resolve this problem by adapting the number of components to the data size. Multiple experiments have been conducted on this model, and it shows competitive performance.<\/jats:p>","DOI":"10.3390\/e21090857","type":"journal-article","created":{"date-parts":[[2019,9,3]],"date-time":"2019-09-03T03:06:14Z","timestamp":1567479974000},"page":"857","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":1,"title":["Kernel Analysis Based on Dirichlet Processes Mixture Models"],"prefix":"10.3390","volume":"21","author":[{"given":"Jinkai","family":"Tian","sequence":"first","affiliation":[{"name":"State Key Laboratory of High Performance Computing, National University of Defense Technology, Changsha 410073, China"},{"name":"College of Computer, National University of Defense Technology, Changsha 410073, China"}]},{"given":"Peifeng","family":"Yan","sequence":"additional","affiliation":[{"name":"State Key Laboratory of High Performance Computing, National University of Defense Technology, Changsha 410073, China"},{"name":"College of Computer, National University of Defense Technology, Changsha 410073, China"}]},{"given":"Da","family":"Huang","sequence":"additional","affiliation":[{"name":"State Key Laboratory of High Performance Computing, National University of Defense Technology, Changsha 410073, China"},{"name":"College of Computer, National University of Defense Technology, Changsha 410073, China"}]}],"member":"1968","published-online":{"date-parts":[[2019,9,2]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","first-page":"209","DOI":"10.1214\/aos\/1176342360","article-title":"A Bayesian analysis of some nonparametric problems","volume":"1","author":"Ferguson","year":"1973","journal-title":"Ann. Stat."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"161","DOI":"10.1198\/016214501750332758","article-title":"Gibbs sampling methods for stick-breaking priors","volume":"96","author":"Ishwaran","year":"2001","journal-title":"J. Am. Stat. Assoc."},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"353","DOI":"10.1214\/aos\/1176342372","article-title":"Ferguson distributions via P\u00f3lya urn schemes","volume":"1","author":"Blackwell","year":"1973","journal-title":"Ann. Stat."},{"key":"ref_4","unstructured":"Pitman, J. (2002). Combinatorial Stochastic Processes, Department of Statistics, University of California. Technical Report 621."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"1","DOI":"10.1007\/BFb0099421","article-title":"Exchangeability and related topics","volume":"1117","author":"Aldous","year":"1985","journal-title":"\u00c9cole d\u2019\u00c9t\u00e9 de Probabilit\u00e9s de Saint-Flour XIII\u20141983"},{"key":"ref_6","unstructured":"Sudderth, E.B., and Jordan, M.I. (2009). Shared segmentation of natural scenes using dependent Pitman-Yor processes. Advances in Neural Information Processing Systems 21, Curran Associates, Inc."},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"1021","DOI":"10.1198\/016214504000002078","article-title":"Bayesian nonparametric spatial modeling with Dirichlet process mixing","volume":"100","author":"Gelfand","year":"2005","journal-title":"J. Am. Stat. Assoc."},{"key":"ref_8","unstructured":"Liang, P., Petrov, S., Jordan, M., and Klein, D. (2007, January 28\u201330). The infinite PCFG using hierarchical Dirichlet processes. Proceedings of the 2007 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning (EMNLP-CoNLL), Prague, Czech Republic."},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Johnson, M., Griffiths, T.L., and Goldwater, S. (2007). Adaptor grammars: A framework for specifying compositional nonparametric Bayesian models. Advances in Neural Information Processing Systems, Curran Associates, Inc.","DOI":"10.7551\/mitpress\/7503.003.0085"},{"key":"ref_10","doi-asserted-by":"crossref","first-page":"2597","DOI":"10.1162\/neco.2008.04-07-504","article-title":"Latent features in similarity judgments: A nonparametric Bayesian approach","volume":"20","author":"Navarro","year":"2008","journal-title":"Neural Comput."},{"key":"ref_11","unstructured":"Kemp, C., Tenenbaum, J.B., Griffiths, T.L., Yamada, T., and Ueda, N. (2006, January 16\u201320). Learning systems of concepts with an infinite relational model. Proceedings of the AAAI\u201906 21st National Conference on Artificial Intelligence, Boston, MA, USA."},{"key":"ref_12","unstructured":"Fox, E., Sudderth, E.B., Jordan, M.I., and Willsky, A.S. (2009). Nonparametric Bayesian learning of switching linear dynamical systems. Advances in Neural Information Processing Systems, Curran Associates, Inc."},{"key":"ref_13","unstructured":"Fox, E., Jordan, M.I., Sudderth, E.B., and Willsky, A.S. (2009). Sharing features among dynamical systems with beta processes. Advances in Neural Information Processing Systems, Curran Associates, Inc."},{"key":"ref_14","doi-asserted-by":"crossref","first-page":"3905","DOI":"10.1109\/TSP.2009.2024987","article-title":"Hidden Markov models with stick-breaking priors","volume":"57","author":"Paisley","year":"2009","journal-title":"IEEE Trans. Signal Process."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Beal, M.J., Ghahramani, Z., and Rasmussen, C.E. (2002). The infinite hidden Markov model. Advances in Neural Information Processing Systems, Curran Associates, Inc.","DOI":"10.7551\/mitpress\/1120.003.0079"},{"key":"ref_16","unstructured":"Neal, R.M. (2012). Bayesian Learning for Neural Networks, Springer Science & Business Media."},{"key":"ref_17","first-page":"63","article-title":"Gaussian processes in machine learning","volume":"3176","author":"Rasmussen","year":"2003","journal-title":"Adv. Lect. Mach. Learn."},{"key":"ref_18","unstructured":"Salimbeni, H., and Deisenroth, M. (2017). Doubly stochastic variational inference for deep Gaussian processes. Advances in Neural Information Processing Systems, Curran Associates, Inc."},{"key":"ref_19","unstructured":"Wilson, A.G., Hu, Z., Salakhutdinov, R.R., and Xing, E.P. (2016, January 5\u201310). Stochastic Variational Deep Kernel Learning. Proceedings of the NIPS\u201916 30th International Conference on Neural Information Processing Systems, Barcelona, Spain."},{"key":"ref_20","unstructured":"Yang, Z., Wilson, A., Smola, A., and Song, L. (2015, January 9\u201312). A La Carte\u2013Learning Fast Kernels. Proceedings of the 18th International Conference on Artificial Intelligence and Statistics, San Diego, CA, USA."},{"key":"ref_21","unstructured":"Damianou, A., and Lawrence, N. (May, January 29). Deep Gaussian processes. Proceedings of the 16th International Conference on Artificial Intelligence and Statistics, Scottsdale, AZ, USA."},{"key":"ref_22","unstructured":"Hinton, G.E., and Salakhutdinov, R.R. (2007, January 3\u20136). Using deep belief nets to learn covariance kernels for Gaussian processes. NIPS\u201907 Proceedings of the 20th International Conference on Neural Information Processing Systems, Vancouver, BC, Canada."},{"key":"ref_23","first-page":"2211","article-title":"Multiple kernel learning algorithms","volume":"12","year":"2011","journal-title":"J. Mach. Learn. Res."},{"key":"ref_24","unstructured":"Duvenaud, D., Lloyd, J.R., Grosse, R., Tenenbaum, J.B., and Ghahramani, Z. (2013). Structure discovery in nonparametric regression through compositional kernel search. arXiv."},{"key":"ref_25","unstructured":"Wilson, A., and Adams, R. (2013, January 16\u201321). Gaussian process kernels for pattern discovery and extrapolation. Proceedings of the 30th International Conference on Machine Learning, Atlanta, GA, USA."},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"1152","DOI":"10.1214\/aos\/1176342871","article-title":"Mixtures of Dirichlet processes with applications to Bayesian nonparametric problems","volume":"2","author":"Antoniak","year":"1974","journal-title":"Ann. Stat."},{"key":"ref_27","first-page":"16","article-title":"Conjugate Bayesian analysis of the Gaussian distribution","volume":"1","author":"Murphy","year":"2007","journal-title":"def"},{"key":"ref_28","unstructured":"Stein, M.L. (2012). Interpolation of Spatial Data: Some Theory for Kriging, Springer Science & Business Media."},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Cox, D.R. (2017). The Theory of Stochastic Processes, Routledge.","DOI":"10.1201\/9780203719152"},{"key":"ref_30","doi-asserted-by":"crossref","first-page":"1289","DOI":"10.1109\/TIT.2006.871582","article-title":"Compressed sensing","volume":"52","author":"Donoho","year":"2006","journal-title":"IEEE Trans. Inf. Theory"},{"key":"ref_31","doi-asserted-by":"crossref","first-page":"705","DOI":"10.1214\/aos\/1056562461","article-title":"Slice sampling","volume":"31","author":"Neal","year":"2003","journal-title":"Ann. Stat."},{"key":"ref_32","first-page":"337","article-title":"Adaptive rejection sampling for Gibbs sampling","volume":"41","author":"Gilks","year":"1992","journal-title":"J. R. Stat. Soc. C"}],"container-title":["Entropy"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1099-4300\/21\/9\/857\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T13:16:01Z","timestamp":1760188561000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1099-4300\/21\/9\/857"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2019,9,2]]},"references-count":32,"journal-issue":{"issue":"9","published-online":{"date-parts":[[2019,9]]}},"alternative-id":["e21090857"],"URL":"https:\/\/doi.org\/10.3390\/e21090857","relation":{},"ISSN":["1099-4300"],"issn-type":[{"type":"electronic","value":"1099-4300"}],"subject":[],"published":{"date-parts":[[2019,9,2]]}}}