{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,25]],"date-time":"2026-01-25T13:03:03Z","timestamp":1769346183520,"version":"3.49.0"},"reference-count":24,"publisher":"MDPI AG","issue":"12","license":[{"start":{"date-parts":[[2018,11,24]],"date-time":"2018-11-24T00:00:00Z","timestamp":1543017600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"The National Key Research and Development Plan of China","award":["2017YFD0400101"],"award-info":[{"award-number":["2017YFD0400101"]}]},{"name":"Zhejiang Province medical and health science and technology platform project","award":["2017KY497"],"award-info":[{"award-number":["2017KY497"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Future Internet"],"abstract":"<jats:p>The prevalence that people share their opinions on the products and services in their daily lives on the Internet has generated a large quantity of comment data, which contain great business value. As for comment sentences, they often contain several comment aspects and the sentiment on these aspects are different, which makes it meaningless to give an overall sentiment polarity of the sentence. In this paper, we introduce Attention-based Aspect-level Recurrent Convolutional Neural Network (AARCNN) to analyze the remarks at aspect-level. The model integrates attention mechanism and target information analysis, which enables the model to concentrate on the important parts of the sentence and to make full use of the target information. The model uses bidirectional LSTM (Bi-LSTM) to build the memory of the sentence, and then CNN is applied to extracting attention from memory to get the attentive sentence representation. The model uses aspect embedding to analyze the target information of the representation and finally the model outputs the sentiment polarity through a softmax layer. 
The model was tested on multi-language datasets, and demonstrated that it has better performance than conventional deep learning methods.<\/jats:p>","DOI":"10.3390\/fi10120116","type":"journal-article","created":{"date-parts":[[2018,11,26]],"date-time":"2018-11-26T03:24:27Z","timestamp":1543202667000},"page":"116","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":35,"title":["A Bi-Directional LSTM-CNN Model with Attention for Aspect-Level Text Classification"],"prefix":"10.3390","volume":"10","author":[{"given":"Yonghua","family":"Zhu","sequence":"first","affiliation":[{"name":"Shanghai Film Academy, Shanghai University, Shanghai 200444, China"}]},{"given":"Xun","family":"Gao","sequence":"additional","affiliation":[{"name":"School of Computer Engineering and Science, Shanghai University, Shanghai 200444, China"}]},{"given":"Weilin","family":"Zhang","sequence":"additional","affiliation":[{"name":"School of Computer Engineering and Science, Shanghai University, Shanghai 200444, China"}]},{"given":"Shenkai","family":"Liu","sequence":"additional","affiliation":[{"name":"Shanghai Film Academy, Shanghai University, Shanghai 200444, China"}]},{"given":"Yuanyuan","family":"Zhang","sequence":"additional","affiliation":[{"name":"College of Information Technology, Zhejiang Chinese Medical University, Hangzhou 310053, China"}]}],"member":"1968","published-online":{"date-parts":[[2018,11,24]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","unstructured":"Pontiki, M., Galanis, D., Papageorgiou, H., Androutsopoulos, I., Manandhar, S., Al-Smadi, M., Al-Ayyoub, M., Zhao, Y., Qin, B., and Clercq, O.D. (2016, January 16\u201317). Semeval-2016 task 5: Aspect based sentiment analysis. Proceedings of the International Workshop on Semantic Evaluation, San Diego, CA, USA.","DOI":"10.18653\/v1\/S16-1002"},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Ma, D., Li, S., Zhang, X., and Wang, H. 
(2017, January 19\u201325). Interactive attention networks for aspect-level sentiment classification. Proceedings of the International Joint Conference on Artificial Intelligence, Melbourne, Australia.","DOI":"10.24963\/ijcai.2017\/568"},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Singh, V.K., Piryani, R., Uddin, A., and Waila, P. (2013, January 22\u201323). Sentiment analysis of movie reviews: A new feature-based heuristic for aspect-level sentiment classification. Proceedings of the International Multi-Conference on Automation, Computing, Communication, Control and Compressed Sensing, Kottayam, India.","DOI":"10.1109\/iMac4s.2013.6526500"},{"key":"ref_4","unstructured":"Joachims, T. (1999, January 27\u201330). Transductive inference for text classification using support vector machines. Proceedings of the Sixteenth International Conference on Machine Learning, Bled, Slovenia."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"103","DOI":"10.1023\/A:1007692713085","article-title":"Text classification from labeled and unlabeled documents using EM","volume":"39","author":"Mccallum","year":"2000","journal-title":"Mach. Learn."},{"key":"ref_6","unstructured":"Jiang, L., Yu, M., Zhou, M., Liu, X., and Zhao, T. (2011, January 19\u201324). Target-dependent twitter sentiment classification. Proceedings of the Annual Meeting of the Association for Computational Linguistics Human Language Technologies, Portland, OR, USA."},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Mikolov, T., Karafi\u00e1t, M., Burget, L., Cernock\u00fd, J., and Khudanpur, S. (2010, January 26\u201330). Recurrent neural network based language model. Proceedings of the INTERSPEECH 2010, 11th Annual Conference of the International Speech Communication Association, Chiba, Japan.","DOI":"10.21437\/Interspeech.2010-343"},{"key":"ref_8","doi-asserted-by":"crossref","unstructured":"Dong, L., Wei, F., Tan, C., Tang, D., Zhou, M., and Xu, K. (2014, January 22\u201327). 
Adaptive recursive neural network for target-dependent twitter sentiment classification. Proceedings of the Meeting of the Association for Computational Linguistics, Baltimore, MD, USA.","DOI":"10.3115\/v1\/P14-2009"},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Nguyen, T.H., and Shirai, K. (2015, January 17\u201321). Phrasernn: Phrase recursive neural network for aspect-based sentiment analysis. Proceedings of the Conference on Empirical Methods in Natural Language Processing, Lisbon, Portugal.","DOI":"10.18653\/v1\/D15-1298"},{"key":"ref_10","unstructured":"Tang, D., Qin, B., Feng, X., and Liu, T. (arXiv, 2015). Target-dependent sentiment classification with long short term memory, arXiv."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Wang, Y., Huang, M., Zhu, X., and Zhao, L. (2016, January 1\u20134). Attention-based LSTM for aspect-level sentiment classification. Proceedings of the Conference on Empirical Methods in Natural Language Processing, Austin, TX, USA.","DOI":"10.18653\/v1\/D16-1058"},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Tang, D., Qin, B., and Liu, T. (2016, January 1\u20134). Aspect level sentiment classification with deep memory network. Proceedings of the Conference on Empirical Methods in Natural Language Processing, Austin, TX, USA.","DOI":"10.18653\/v1\/D16-1021"},{"key":"ref_13","unstructured":"Du, J., Gui, L., Xu, R., and He, Y. (2017, January 8\u201312). A convolutional attention model for text classification. Proceedings of the National CCF Conference on Natural Language Processing and Chinese Computing, Dalian, China."},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Pennington, J., Socher, R., and Manning, C. (2014, January 25\u201329). Glove: Global vectors for word representation. 
Proceedings of the Conference on Empirical Methods in Natural Language Processing, Doha, Qatar.","DOI":"10.3115\/v1\/D14-1162"},{"key":"ref_15","unstructured":"Mikolov, T., Chen, K., Corrado, G., and Dean, J. (arXiv, 2013). Efficient estimation of word representations in vector space, arXiv."},{"key":"ref_16","doi-asserted-by":"crossref","first-page":"1735","DOI":"10.1162\/neco.1997.9.8.1735","article-title":"Long short-term memory","volume":"9","author":"Hochreiter","year":"1997","journal-title":"Neural Comput."},{"key":"ref_17","doi-asserted-by":"crossref","first-page":"181","DOI":"10.1002\/jmri.24365","article-title":"Fast image reconstruction with l2-regularization","volume":"40","author":"Bilgic","year":"2014","journal-title":"J. Magn. Reson. Imaging"},{"key":"ref_18","unstructured":"Krizhevsky, A., Sutskever, I., and Hinton, G.E. (2012, January 3\u20136). Imagenet classification with deep convolutional neural networks. Proceedings of the International Conference on Neural Information Processing Systems, Lake Tahoe, NV, USA."},{"key":"ref_19","first-page":"1929","article-title":"Dropout: A simple way to prevent neural networks from overfitting","volume":"15","author":"Srivastava","year":"2014","journal-title":"J. Mach. Learn. Res."},{"key":"ref_20","unstructured":"Goldhahn, D., Eckart, T., and Quasthoff, U. (2012, January 23\u201325). Building large monolingual dictionaries at the Leipzig corpora collection: From 100 to 200 languages. Proceedings of the 8th International Language Resources and Evaluation (LREC\u201912), Istanbul, Turkey."},{"key":"ref_21","unstructured":"Tseng, H., Chang, P., Andrew, G., Jurafsky, D., and Manning, C. (2005). A conditional random field word segmenter. Found. Sci., 168\u2013171."},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Wang, X., Liu, Y., Sun, C., Wang, B., and Wang, X. (2015, January 26\u201331). Predicting polarities of tweets by composing word embeddings with long short-term memory. 
Proceedings of the Meeting of the Association for Computational Linguistics and the International Joint Conference on Natural Language Processing, Beijing, China.","DOI":"10.3115\/v1\/P15-1130"},{"key":"ref_23","doi-asserted-by":"crossref","first-page":"259","DOI":"10.1162\/tacl_a_00097","article-title":"Abcnn: Attention-based convolutional neural network for modeling sentence pairs","volume":"4","author":"Yin","year":"2016","journal-title":"Trans. Assoc. Comput. Linguist."},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Wang, J., Yu, L.C., Lai, K.R., and Zhang, X. (2016, January 7\u201312). Dimensional sentiment analysis using a regional CNN-LSTM model. Proceedings of the Meeting of the Association for Computational Linguistics, Berlin, Germany.","DOI":"10.18653\/v1\/P16-2037"}],"container-title":["Future Internet"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1999-5903\/10\/12\/116\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T15:31:53Z","timestamp":1760196713000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1999-5903\/10\/12\/116"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2018,11,24]]},"references-count":24,"journal-issue":{"issue":"12","published-online":{"date-parts":[[2018,12]]}},"alternative-id":["fi10120116"],"URL":"https:\/\/doi.org\/10.3390\/fi10120116","relation":{},"ISSN":["1999-5903"],"issn-type":[{"value":"1999-5903","type":"electronic"}],"subject":[],"published":{"date-parts":[[2018,11,24]]}}}