{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,18]],"date-time":"2026-03-18T01:06:07Z","timestamp":1773795967844,"version":"3.50.1"},"reference-count":42,"publisher":"MDPI AG","issue":"8","license":[{"start":{"date-parts":[[2022,7,29]],"date-time":"2022-07-29T00:00:00Z","timestamp":1659052800000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"Major Projects of National Social Science Fund of China","award":["20AZD114"],"award-info":[{"award-number":["20AZD114"]}]},{"name":"Major Projects of National Social Science Fund of China","award":["CCF-NSFOCUS 2020011"],"award-info":[{"award-number":["CCF-NSFOCUS 2020011"]}]},{"name":"Major Projects of National Social Science Fund of China","award":["2020SYS08"],"award-info":[{"award-number":["2020SYS08"]}]},{"name":"\u201cKunpeng\u201d Research Fund Project of CCF-Green Alliance Technology","award":["20AZD114"],"award-info":[{"award-number":["20AZD114"]}]},{"name":"\u201cKunpeng\u201d Research Fund Project of CCF-Green Alliance Technology","award":["CCF-NSFOCUS 2020011"],"award-info":[{"award-number":["CCF-NSFOCUS 2020011"]}]},{"name":"\u201cKunpeng\u201d Research Fund Project of CCF-Green Alliance Technology","award":["2020SYS08"],"award-info":[{"award-number":["2020SYS08"]}]},{"name":"Public Safety Behavioral Sciences Laboratory Open Subject Fund Program of PPSUC","award":["20AZD114"],"award-info":[{"award-number":["20AZD114"]}]},{"name":"Public Safety Behavioral Sciences Laboratory Open Subject Fund Program of PPSUC","award":["CCF-NSFOCUS 2020011"],"award-info":[{"award-number":["CCF-NSFOCUS 2020011"]}]},{"name":"Public Safety Behavioral Sciences Laboratory Open Subject Fund Program of PPSUC","award":["2020SYS08"],"award-info":[{"award-number":["2020SYS08"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Future Internet"],"abstract":"<jats:p>To 
address the shortcomings of existing deep learning models and the characteristics of microblog speech, we propose the DCCMM model to improve the effectiveness of microblog sentiment analysis. The model employs WOBERT Plus and ALBERT to dynamically encode character-level text and word-level text, respectively. Then, a convolution operation is used to extract local key features, while cross-channel feature fusion and multi-head self-attention pooling operations are used to extract global semantic information and filter out key data, before using the multi-granularity feature interaction fusion operation to effectively fuse character-level and word-level semantic information. Finally, the Softmax function is used to output the results. On the weibo_senti_100k dataset, the accuracy and F1 values of the DCCMM model improve by 0.84% and 1.01%, respectively, compared to the best-performing comparison model. On the SMP2020-EWECT dataset, the accuracy and F1 values of the DCCMM model improve by 1.22% and 1.80%, respectively, compared with the experimental results of the best-performing comparison model. 
The results show that DCCMM outperforms existing advanced sentiment analysis models.<\/jats:p>","DOI":"10.3390\/fi14080234","type":"journal-article","created":{"date-parts":[[2022,7,31]],"date-time":"2022-07-31T23:37:29Z","timestamp":1659310649000},"page":"234","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":10,"title":["Microblog Sentiment Analysis Based on Dynamic Character-Level and Word-Level Features and Multi-Head Self-Attention Pooling"],"prefix":"10.3390","volume":"14","author":[{"given":"Shangyi","family":"Yan","sequence":"first","affiliation":[{"name":"College of Information and Cyber Security, People\u2019s Public Security University of China, Beijing 100038, China"}]},{"given":"Jingya","family":"Wang","sequence":"additional","affiliation":[{"name":"College of Information and Cyber Security, People\u2019s Public Security University of China, Beijing 100038, China"}]},{"given":"Zhiqiang","family":"Song","sequence":"additional","affiliation":[{"name":"College of Information and Cyber Security, People\u2019s Public Security University of China, Beijing 100038, China"}]}],"member":"1968","published-online":{"date-parts":[[2022,7,29]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","unstructured":"Gu, M., Guo, H., Zhuang, J., Du, Y., and Qian, L. (2022). Social Media User Behavior and Emotions during Crisis Events. Int. J. Environ. Res. Public Health, 19.","DOI":"10.3390\/ijerph19095197"},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Alsini, A., Huynh, D.Q., and Datta, A. (2021). Hashtag Recommendation Methods for Twitter and Sina Weibo: A Review. Future Internet, 13.","DOI":"10.3390\/fi13050129"},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Li, H., Ma, Y., Ma, Z., and Zhu, H. (2021). Weibo Text Sentiment Analysis Based on BERT and Deep Learning. Appl. 
Sci., 11.","DOI":"10.3390\/app112210774"},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"5536560","DOI":"10.1155\/2021\/5536560","article-title":"Evaluation of sentiment analysis via word embedding and RNN variants for Amazon online reviews","volume":"2021","author":"Alharbi","year":"2021","journal-title":"Math. Probl. Eng."},{"key":"ref_5","doi-asserted-by":"crossref","first-page":"387","DOI":"10.1177\/0165551520910032","article-title":"Semisupervised sentiment analysis method for online text reviews","volume":"47","author":"Lee","year":"2021","journal-title":"J. Inf. Sci."},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Jamal, N., Xianqiao, C., and Aldabbas, H. (2019). Deep Learning-Based Sentimental Analysis for Large-Scale Imbalanced Twitter Data. Future Internet, 11.","DOI":"10.3390\/fi11090190"},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"012057","DOI":"10.1088\/1742-6596\/1229\/1\/012057","article-title":"An efficient character-level and word-level feature fusion method for Chinese text classification","volume":"1229","author":"Wenzhen","year":"2019","journal-title":"J. Phys. Conf. Ser."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"4105","DOI":"10.3233\/JIFS-212495","article-title":"Research on named entity recognition of chinese electronic medical records based on multi-head attention mechanism and character-word information fusion","volume":"42","author":"Zhang","year":"2022","journal-title":"J. Intell. Fuzzy Syst."},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Hu, C., Zhang, S., Gu, T., Yan, Z., and Jiang, J. (2022). Multi-Task Joint Learning Model for Chinese Word Segmentation and Syndrome Differentiation in Traditional Chinese Medicine. Int. J. Environ. Res. 
Public Health, 19.","DOI":"10.3390\/ijerph19095601"},{"key":"ref_10","first-page":"8405623","article-title":"Sentiment Analysis of Student Texts Using the CNN-BiGRU-AT Model","volume":"2021","author":"Yan","year":"2021","journal-title":"Sci. Program."},{"key":"ref_11","doi-asserted-by":"crossref","first-page":"107456","DOI":"10.1016\/j.knosys.2021.107456","article-title":"MC-Net: Multiple max-pooling integration module and cross multi-scale deconvolution network","volume":"231","author":"You","year":"2021","journal-title":"Knowl.-Based Syst."},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Yang, P., Zhou, H., Zhu, Y., Liu, L., and Zhang, L. (2020). Malware Classification Based on Shallow Neural Network. Future Internet, 12.","DOI":"10.3390\/fi12120219"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Tong, X., Wang, J., Jiao, K., Wang, R., and Pan, X. (2020, January 6). Robustness Detection Method of Chinese Spam Based on the Features of Joint Characters-Words. Proceedings of the International Conference on Computer Engineering and Networks, Singapore.","DOI":"10.1007\/978-981-15-8462-6_97"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Chen, W., Fan, C., Wu, Y., and Lou, Z. (2020, January 3\u20135). A Chinese Character-Level and Word-Level Complementary Text Classification Method. Proceedings of the 2020 International Conference on Technologies and Applications of Artificial Intelligence (TAAI), Taipei, Taiwan.","DOI":"10.1109\/TAAI51410.2020.00042"},{"key":"ref_15","unstructured":"Mikolov, T., Sutskever, I., Chen, K., Corrado, G.S., and Dean, J. (2013, January 5\u201310). Distributed representations of words and phrases and their compositionality. Proceedings of the Advances in Neural Information Processing Systems, Lake Tahoe, NV, USA."},{"key":"ref_16","unstructured":"Devlin, J., Chang, M.W., Lee, K., and Toutanova, K. (2018). Bert: Pre-Training of deep bidirectional transformers for language understanding. 
arXiv."},{"key":"ref_17","unstructured":"Peters, M.E., Neumann, M., Iyyer, M., Gardner, M., Clark, C., Lee, K., and Zettlemoyer, L. (2018). Deep Contextualized Word Representations. arXiv."},{"key":"ref_18","unstructured":"Radford, A., Narasimhan, K., Salimans, T., and Sutskever, I. (2022, June 18). Improving Language Understanding by Generative Pre-Training. Available online: https:\/\/www.cs.ubc.ca\/~amuham01\/LING530\/papers\/radford2018improving.pdf."},{"key":"ref_19","unstructured":"Lan, Z., Chen, M., Goodman, S., Gimpel, K., Sharma, P., and Soricut, R. (2019). Albert: A lite bert for self-supervised learning of language representations. arXiv."},{"key":"ref_20","unstructured":"Liu, Y., Ott, M., Goyal, N., Du, J., Joshi, M., Chen, D., and Stoyanov, V. (2019). Roberta: A robustly optimized bert pretraining approach. arXiv."},{"key":"ref_21","doi-asserted-by":"crossref","first-page":"3504","DOI":"10.1109\/TASLP.2021.3124365","article-title":"Pre-Training with whole word masking for chinese bert","volume":"29","author":"Cui","year":"2021","journal-title":"IEEE\/ACM Trans. Audio Speech Lang. Processing"},{"key":"ref_22","doi-asserted-by":"crossref","first-page":"64","DOI":"10.1162\/tacl_a_00300","article-title":"Spanbert: Improving pre-Training by representing and predicting spans","volume":"8","author":"Joshi","year":"2020","journal-title":"Trans. Assoc. Comput. Linguist."},{"key":"ref_23","unstructured":"Su, J. (2022, June 18). Speed Up without Losing Points: Chinese WoBERT Based on Word Granularity. Available online: https:\/\/kexue.fm\/archives\/7758."},{"key":"ref_24","unstructured":"(2022, June 18). ZhuiyiTechnology: Chinese BERT with Word as Basic Unit. Available online: https:\/\/github.com\/ZhuiyiTechnology\/WoBERT."},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Chen, S., Zhang, H., and Lei, Z. (2021). Person Re-Identification Based on Attention Mechanism and Context Information Fusion. 
Future Internet, 13.","DOI":"10.3390\/fi13030072"},{"key":"ref_26","doi-asserted-by":"crossref","first-page":"4117","DOI":"10.1007\/s12652-020-01791-9","article-title":"Sentiment analysis of student feedback using multi-Head attention fusion model of word and context embedding for LSTM","volume":"12","author":"Sangeetha","year":"2021","journal-title":"J. Ambient. Intell. Humaniz. Computing"},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"India, M., Safari, P., and Hernando, J. (2019). Self multi-Head attention for speaker recognition. arXiv.","DOI":"10.21437\/Interspeech.2019-2616"},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Fang, Y., Gao, J., Huang, C., Peng, H., and Wu, R. (2019). Self multi-Head attention-based convolutional neural networks for fake news detection. PLoS ONE, 14.","DOI":"10.1371\/journal.pone.0222713"},{"key":"ref_29","unstructured":"Yao, L., Mao, C., and Luo, Y. (February, January 27). Graph convolutional networks for text classification. Proceedings of the AAAI Conference on Artificial Intelligence, Honolulu, HI, USA."},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Chi, X., and Xiang, Y. (2021). Augmenting paraphrase generation with syntax information using graph convolutional networks. Entropy, 23.","DOI":"10.20944\/preprints202103.0754.v1"},{"key":"ref_31","first-page":"84","article-title":"A Comparative Study of Graph Convolutional Networks and Self-Attention Mechanism on Text Classification","volume":"35","author":"Jiang","year":"2021","journal-title":"J. Chin. Inf. Processing"},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Kim, Y. (2014, January 25\u201329). Convolutional neural networks for sentence classification. 
Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing, Doha, Qatar.","DOI":"10.3115\/v1\/D14-1181"},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Cho, K., Van Merri\u00ebnboer, B., Gulcehre, C., Bahdanau, D., Bougares, F., Schwenk, H., and Bengio, Y. (2014). Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv.","DOI":"10.3115\/v1\/D14-1179"},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Zhou, P., Shi, W., Tian, J., Qi, Z., Li, B., Hao, H., and Xu, B. (2016, January 7\u201312). Attention-Based bidirectional long short-term memory networks for relation classification. Proceedings of the 54th Annual Meeting of the Association for Computational Linguistics, Berlin, Germany.","DOI":"10.18653\/v1\/P16-2034"},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Lai, S., Xu, L., Liu, K., and Zhao, J. (2015, January 25). Recurrent convolutional neural networks for text classification. Proceedings of the AAAI Conference on Artificial Intelligence, Austin, TX, USA.","DOI":"10.1609\/aaai.v29i1.9513"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Song, G., and Huang, D. (2021). A sentiment-Aware contextual model for real-time disaster prediction using Twitter data. Future Internet, 13.","DOI":"10.3390\/fi13070163"},{"key":"ref_37","first-page":"156","article-title":"Microblog Sentiment Analysis Based on BERT and Hierarchical Attention","volume":"58","author":"Zhao","year":"2022","journal-title":"Comput. Eng. Appl."},{"key":"ref_38","doi-asserted-by":"crossref","unstructured":"Peng, S., Zeng, R., Liu, H., Chen, G., Wu, R., Yang, A., and Yu, S. (2021, January 23\u201325). Emotion Classification of Text Based on BERT and Broad Learning System. 
Proceedings of the Asia-Pacific Web (APWeb) and Web-Age Information Management (WAIM) Joint International Conference on Web and Big Data, Guangzhou, China.","DOI":"10.1007\/978-3-030-85896-4_30"},{"key":"ref_39","doi-asserted-by":"crossref","first-page":"1240","DOI":"10.1109\/TII.2021.3085663","article-title":"A Sentiment Classification Method of Web Social Media Based on Multidimensional and Multilevel Modeling","volume":"18","author":"Wang","year":"2022","journal-title":"IEEE Trans. Ind. Inform."},{"key":"ref_40","doi-asserted-by":"crossref","unstructured":"Mu, Z., Zheng, S., and Wang, Q. (2021, January 12\u201314). ACL-RoBERTa-CNN Text Classification Model Combined with Contrastive Learning. Proceedings of the 2021 International Conference on Big Data Engineering and Education (BDEE), Guiyang, China.","DOI":"10.1109\/BDEE52938.2021.00041"},{"key":"ref_41","first-page":"8865983","article-title":"Chinese Microblog Sentiment Detection Based on CNN-BiGRU and Multihead Attention Mechanism","volume":"2020","author":"Qiu","year":"2020","journal-title":"Sci. Program."},{"key":"ref_42","doi-asserted-by":"crossref","unstructured":"Tang, F., and Nongpong, K. (2021, January 21\u201324). Chinese sentiment analysis based on lightweight character-level bert. 
Proceedings of the 2021 13th International Conference on Knowledge and Smart Technology (KST), Bangsaen, Thailand.","DOI":"10.1109\/KST51265.2021.9415790"}],"container-title":["Future Internet"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/1999-5903\/14\/8\/234\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T23:59:06Z","timestamp":1760140746000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/1999-5903\/14\/8\/234"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,7,29]]},"references-count":42,"journal-issue":{"issue":"8","published-online":{"date-parts":[[2022,8]]}},"alternative-id":["fi14080234"],"URL":"https:\/\/doi.org\/10.3390\/fi14080234","relation":{},"ISSN":["1999-5903"],"issn-type":[{"value":"1999-5903","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,7,29]]}}}