{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,4,4]],"date-time":"2026-04-04T05:11:55Z","timestamp":1775279515140,"version":"3.50.1"},"reference-count":21,"publisher":"MDPI AG","issue":"11","license":[{"start":{"date-parts":[[2018,11,20]],"date-time":"2018-11-20T00:00:00Z","timestamp":1542672000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Future Internet"],"abstract":"<jats:p>Text classification is of importance in natural language processing, as the massive text information containing huge amounts of value needs to be classified into different categories for further use. In order to better classify text, our paper tries to build a deep learning model which achieves better classification results in Chinese text than those of other researchers\u2019 models. After comparing different methods, long short-term memory (LSTM) and convolutional neural network (CNN) methods were selected as deep learning methods to classify Chinese text. LSTM is a special kind of recurrent neural network (RNN), which is capable of processing serialized information through its recurrent structure. By contrast, CNN has shown its ability to extract features from visual imagery. Therefore, two layers of LSTM and one layer of CNN were integrated into our new model: the BLSTM-C model (BLSTM stands for bi-directional long short-term memory while C stands for CNN). LSTM was responsible for obtaining a sequence output based on past and future contexts, which was then input to the convolutional layer for extracting features. In our experiments, the proposed BLSTM-C model was evaluated in several ways. 
In the results, the model exhibited remarkable performance in text classification, especially in Chinese texts.<\/jats:p>","DOI":"10.3390\/fi10110113","type":"journal-article","created":{"date-parts":[[2018,11,21]],"date-time":"2018-11-21T11:23:27Z","timestamp":1542799407000},"page":"113","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":78,"title":["Chinese Text Classification Model Based on Deep Learning"],"prefix":"10.3390","volume":"10","author":[{"ORCID":"https:\/\/orcid.org\/0000-0003-4539-4018","authenticated-orcid":false,"given":"Yue","family":"Li","sequence":"first","affiliation":[{"name":"School of Computer Science and Technology, Donghua University, Shanghai 201620, China"}]},{"given":"Xutao","family":"Wang","sequence":"additional","affiliation":[{"name":"School of Computer Science and Technology, Donghua University, Shanghai 201620, China"}]},{"given":"Pengjian","family":"Xu","sequence":"additional","affiliation":[{"name":"School of Computer Science and Technology, Donghua University, Shanghai 201620, China"}]}],"member":"1968","published-online":{"date-parts":[[2018,11,20]]},"reference":[{"key":"ref_1","unstructured":"Wang, S., and Manning, C.D. (2012, January 8\u201314). Baselines and bigrams: Simple, good sentiment and topic classification. Proceedings of the 50th Annual Meeting of the Association for Computational Linguistics, Jeju Island, Korea."},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Socher, R., Perelygin, A., Wu, J.Y., Chuang, J., Manning, C.D., Ng, A.Y., and Potts, C. (2013, January 18\u201321). Recursive deep models for semantic compositionality over a sentiment treebank. 
Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, Seattle, WA, USA.","DOI":"10.18653\/v1\/D13-1170"},{"key":"ref_3","doi-asserted-by":"crossref","first-page":"1735","DOI":"10.1162\/neco.1997.9.8.1735","article-title":"Long short-term memory","volume":"9","author":"Hochreiter","year":"1997","journal-title":"Neural Comput."},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Cho, K., Merrienboer, B.V., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., and Bengio, Y. (arXiv, 2014). Learning phrase representations using rnn encoder-decoder for statistical machine translation, arXiv.","DOI":"10.3115\/v1\/D14-1179"},{"key":"ref_5","unstructured":"Krizhevsky, A., Sutskever, I., and Hinton, G.E. (2012, January 3\u20136). Imagenet classification with deep convolutional neural networks. Proceedings of the Advances in Neural Information Processing Systems, Lake Tahoe, NV, USA."},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Kalchbrenner, N., Grefenstette, E., and Blunsom, P. (arXiv, 2014). A convolutional neural network for modelling sentences, arXiv.","DOI":"10.3115\/v1\/P14-1062"},{"key":"ref_7","unstructured":"Mikolov, T., Sutskever, I., Chen, K., Corrado, G.S., and Dean, J. (2013, January 5\u201310). Distributed representations of words and phrases and their compositionality. Proceedings of the Advances in Neural Information Processing Systems, Lake Tahoe, NV, USA."},{"key":"ref_8","unstructured":"Le, Q., and Mikolov, T. (2014, January 21\u201326). Distributed representations of sentences and documents. Proceedings of the 31st International Conference on Machine Learning (ICML-14), Beijing, China."},{"key":"ref_9","doi-asserted-by":"crossref","first-page":"2278","DOI":"10.1109\/5.726791","article-title":"Gradient-based learning applied to document recognition","volume":"86","author":"LeCun","year":"1998","journal-title":"Proc. IEEE"},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Kim, Y. (arXiv, 2014). 
Convolutional neural networks for sentence classification, arXiv.","DOI":"10.3115\/v1\/D14-1181"},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Conneau, A., Schwenk, H., Barrault, L., and Lecun, Y. (arXiv, 2017). Very deep convolutional networks for text classification, arXiv.","DOI":"10.18653\/v1\/E17-1104"},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Tai, K.S., Socher, R., and Manning, C.D. (arXiv, 2015). Improved semantic representations from tree-structured long short-term memory networks, arXiv.","DOI":"10.3115\/v1\/P15-1150"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Zhou, P., Shi, W., Tian, J., Qi, Z., Li, B., Hao, H., and Xu, B. (2016, January 7\u201312). Attention-based bidirectional long short-term memory networks for relation classification. Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, Berlin, Germany.","DOI":"10.18653\/v1\/P16-2034"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Zhang, H.-P., Yu, H.-K., Xiong, D.-Y., and Liu, Q. (2003, January 11\u201312). Hhmm-based chinese lexical analyzer ictclas. Proceedings of the Second SIGHAN Workshop on Chinese Language Processing, SIGHAN \u201903, Sapporo, Japan.","DOI":"10.3115\/1119250.1119280"},{"key":"ref_15","unstructured":"Zhou, C., Sun, C., Liu, Z., and Lau, F. (arXiv, 2015). A c-lstm neural network for text classification, arXiv."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"516","DOI":"10.1016\/j.ins.2017.09.010","article-title":"A novel multi-modality image fusion method based on image decomposition and sparse representation","volume":"432","author":"Zhu","year":"2018","journal-title":"Inf. Sci."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"1641","DOI":"10.1109\/29.46546","article-title":"Speaker-independent phone recognition using hidden Markov models","volume":"37","author":"Lee","year":"1989","journal-title":"IEEE Trans. Acoust. 
Speech Signal Process."},{"key":"ref_18","unstructured":"Socher, R., Huval, B., Manning, C.D., and Ng, A.Y. (2012, January 12\u201314). Semantic compositionality through recursive matrix-vector spaces. Proceedings of the Empirical Methods in Natural Language Processing, Jeju Island, Korea."},{"key":"ref_19","unstructured":"Irsoy, O., and Cardie, C. (2014, January 8\u201313). Deep recursive neural networks for compositionality in language. Proceedings of the Advances in Neural Information Processing Systems, Montreal, QC, Canada."},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Lei, T., Barzilay, R., and Jaakkola, T. (arXiv, 2015). Molding cnns for text: Non-linear, nonconsecutive convolutions, arXiv.","DOI":"10.18653\/v1\/D15-1180"},{"key":"ref_21","unstructured":"Zhu, X., Sobhani, P., and Guo, H. (2015, January 6\u201311). Long short-term memory over recursive structures. Proceedings of the 32nd International Conference on Machine Learning, Lille, France."}],"container-title":["Future Internet"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1999-5903\/10\/11\/113\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,4,4]],"date-time":"2026-04-04T04:34:50Z","timestamp":1775277290000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1999-5903\/10\/11\/113"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2018,11,20]]},"references-count":21,"journal-issue":{"issue":"11","published-online":{"date-parts":[[2018,11]]}},"alternative-id":["fi10110113"],"URL":"https:\/\/doi.org\/10.3390\/fi10110113","relation":{},"ISSN":["1999-5903"],"issn-type":[{"value":"1999-5903","type":"electronic"}],"subject":[],"published":{"date-parts":[[2018,11,20]]}}}