{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,21]],"date-time":"2026-02-21T18:42:44Z","timestamp":1771699364431,"version":"3.50.1"},"reference-count":62,"publisher":"Springer Science and Business Media LLC","issue":"1","license":[{"start":{"date-parts":[[2023,10,23]],"date-time":"2023-10-23T00:00:00Z","timestamp":1698019200000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2023,10,23]],"date-time":"2023-10-23T00:00:00Z","timestamp":1698019200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"name":"Sichuan Science and Technology Program","award":["2021YFQ0003"],"award-info":[{"award-number":["2021YFQ0003"]}]},{"name":"Sichuan Science and Technology Program","award":["2023YFSY0026"],"award-info":[{"award-number":["2023YFSY0026"]}]},{"name":"Sichuan Science and Technology Program","award":["2023YFH0004"],"award-info":[{"award-number":["2023YFH0004"]}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Int J Comput Intell Syst"],"abstract":"<jats:title>Abstract<\/jats:title><jats:p>Natural language processing (NLP) based on deep learning provides a positive performance for generative dialogue system, and the transformer model is a new boost in NLP after the advent of word vectors. In this paper, a Chinese generative dialogue system based on transformer is designed, which only uses a multi-layer transformer decoder to build the system and uses the design of an incomplete mask to realize one-way language generation. That is, questions can perceive context information in both directions, while reply sentences can only output one-way autoregressive. 
The above system improvements make the one-way generation of dialogue tasks more logical and reasonable, and the performance is better than the traditional dialogue system scheme. In consideration of the long-distance information weakness of absolute position coding, we put forward the improvement of relative position coding in theory, and verify it in subsequent experiments. In the transformer module, the calculation formula of self-attention is modified, and the relative position information is added to replace the absolute position coding of the position embedding layer. The performance of the modified model in BLEU, embedding average, grammatical and semantic coherence is ideal, to enhance long-distance attention.<\/jats:p>","DOI":"10.1007\/s44196-023-00345-z","type":"journal-article","created":{"date-parts":[[2023,10,23]],"date-time":"2023-10-23T11:01:29Z","timestamp":1698058889000},"update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":74,"title":["Design of a Modified Transformer Architecture Based on Relative Position 
Coding"],"prefix":"10.1007","volume":"16","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-8486-1654","authenticated-orcid":false,"given":"Wenfeng","family":"Zheng","sequence":"first","affiliation":[]},{"given":"Gu","family":"Gong","sequence":"additional","affiliation":[]},{"given":"Jiawei","family":"Tian","sequence":"additional","affiliation":[]},{"given":"Siyu","family":"Lu","sequence":"additional","affiliation":[]},{"given":"Ruiyang","family":"Wang","sequence":"additional","affiliation":[]},{"given":"Zhengtong","family":"Yin","sequence":"additional","affiliation":[]},{"given":"Xiaolu","family":"Li","sequence":"additional","affiliation":[]},{"given":"Lirong","family":"Yin","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2023,10,23]]},"reference":[{"issue":"June","key":"345_CR1","doi-asserted-by":"publisher","first-page":"327","DOI":"10.1016\/j.neucom.2020.01.126","volume":"439","author":"L Mateju","year":"2021","unstructured":"Mateju, L., Griol, D., Callejas, Z., Molina, J.M., Sanchis, A.: An empirical assessment of deep learning approaches to task-oriented dialog management. Neurocomputing 439(June), 327\u2013339 (2021). https:\/\/doi.org\/10.1016\/j.neucom.2020.01.126","journal-title":"Neurocomputing"},{"issue":"4","key":"345_CR2","doi-asserted-by":"publisher","first-page":"3055","DOI":"10.1007\/s10462-022-10248-8","volume":"56","author":"J Ni","year":"2023","unstructured":"Ni, J., Young, T., Pandelea, V., Xue, F., Cambria, E.: Recent advances in deep learning based dialogue systems: a systematic survey. Artif. Intell. Rev. 56(4), 3055\u20133155 (2023). https:\/\/doi.org\/10.1007\/s10462-022-10248-8","journal-title":"Artif. Intell. 
Rev."},{"issue":"January","key":"345_CR3","doi-asserted-by":"publisher","first-page":"443","DOI":"10.1016\/j.neucom.2021.05.103","volume":"470","author":"I Lauriola","year":"2022","unstructured":"Lauriola, I., Lavelli, A., Aiolli, F.: An introduction to deep learning in natural language processing: models, techniques, and tools. Neurocomputing 470(January), 443\u2013456 (2022). https:\/\/doi.org\/10.1016\/j.neucom.2021.05.103","journal-title":"Neurocomputing"},{"key":"345_CR4","doi-asserted-by":"publisher","unstructured":"Zhu X (2022) \u201cRNN Language Processing Model-Driven Spoken Dialogue System Modeling Method.\u201d Edited by Xin Ning. Computational Intelligence and Neuroscience 2022 (February): 1\u20139. https:\/\/doi.org\/10.1155\/2022\/6993515.","DOI":"10.1155\/2022\/6993515"},{"issue":"December","key":"345_CR5","doi-asserted-by":"publisher","DOI":"10.1016\/j.eswa.2022.118277","volume":"209","author":"Y Park","year":"2022","unstructured":"Park, Y., Ko, Y., Seo, J.: BERT-based response selection in dialogue systems using utterance attention mechanisms. Expert Syst. Appl. 209(December), 118277 (2022). https:\/\/doi.org\/10.1016\/j.eswa.2022.118277","journal-title":"Expert Syst. Appl."},{"key":"345_CR6","doi-asserted-by":"publisher","unstructured":"Junaid T, Sumathi D, Sasikumar AN, Suthir S, Manikandan J, Rashmita K, Kuppusamy PG, Janardhana Raju M (2022) A comparative analysis of transformer based models for figurative language classification. Comput Electr Eng 101 (July): 108051. https:\/\/doi.org\/10.1016\/j.compeleceng.2022.108051","DOI":"10.1016\/j.compeleceng.2022.108051"},{"issue":"July","key":"345_CR7","doi-asserted-by":"publisher","DOI":"10.1016\/j.compchemeng.2023.108264","volume":"175","author":"J Li","year":"2023","unstructured":"Li, J., Joe Qin, S.: Applying and dissecting LSTM neural networks and regularized learning for dynamic inferential modeling. Comput. Chem. Eng. 175(July), 108264 (2023). 
https:\/\/doi.org\/10.1016\/j.compchemeng.2023.108264","journal-title":"Comput. Chem. Eng."},{"issue":"March","key":"345_CR8","doi-asserted-by":"publisher","DOI":"10.1016\/j.physd.2019.132306","volume":"404","author":"A Sherstinsky","year":"2020","unstructured":"Sherstinsky, A.: Fundamentals of recurrent neural network (RNN) and long short-term memory (LSTM) network. Physica D 404(March), 132306 (2020). https:\/\/doi.org\/10.1016\/j.physd.2019.132306","journal-title":"Physica D"},{"issue":"July","key":"345_CR9","doi-asserted-by":"publisher","DOI":"10.1016\/j.asoc.2023.110314","volume":"142","author":"PB Weerakody","year":"2023","unstructured":"Weerakody, P.B., Wong, K.W., Wang, G.: Policy gradient empowered LSTM with dynamic skips for irregular time series data. Appl. Soft Comput. 142(July), 110314 (2023). https:\/\/doi.org\/10.1016\/j.asoc.2023.110314","journal-title":"Appl. Soft Comput."},{"issue":"June","key":"345_CR10","doi-asserted-by":"publisher","first-page":"460","DOI":"10.1016\/j.psep.2023.04.020","volume":"174","author":"X Zhang","year":"2023","unstructured":"Zhang, X., Shi, J., Yang, M., Huang, X., Usmani, A.S., Chen, G., Jianmin, Fu., Huang, J., Li, J.: Real-time pipeline leak detection and localization using an attention-based LSTM approach. Process. Saf. Environ. Prot. 174(June), 460\u2013472 (2023). https:\/\/doi.org\/10.1016\/j.psep.2023.04.020","journal-title":"Process. Saf. Environ. Prot."},{"key":"345_CR11","unstructured":"Sutskever I, Vinyals O (2014) Sequence to sequence learning with neural networks. Adv Neural Inform Process Syst"},{"key":"345_CR12","unstructured":"Bahdanau D, Cho K, Bengio Y (2014) Neural machine translation by jointly learning to align and translate. 
Computer Science"},{"issue":"8","key":"345_CR13","doi-asserted-by":"publisher","DOI":"10.1088\/1361-6501\/ac632d","volume":"33","author":"J Li","year":"2022","unstructured":"Li, J., Chen, R., Huang, X.: A sequence-to-sequence remaining useful life prediction method combining unsupervised LSTM encoding-decoding and temporal convolutional network. Meas. Sci. Technol. 33(8), 085013 (2022). https:\/\/doi.org\/10.1088\/1361-6501\/ac632d","journal-title":"Meas. Sci. Technol."},{"issue":"October","key":"345_CR14","doi-asserted-by":"publisher","first-page":"432","DOI":"10.1016\/j.neucom.2020.04.137","volume":"410","author":"Z Liang","year":"2020","unstructured":"Liang, Z., Junping, Du., Li, C.: Abstractive social media text summarization using selective reinforced Seq2Seq attention model. Neurocomputing 410(October), 432\u2013440 (2020). https:\/\/doi.org\/10.1016\/j.neucom.2020.04.137","journal-title":"Neurocomputing"},{"key":"345_CR15","doi-asserted-by":"crossref","unstructured":"Britz D, Goldie A, Luong M-T, Quoc L (2017) Massive Exploration of Neural Machine Translation Architectures. arXiv.","DOI":"10.18653\/v1\/D17-1151"},{"key":"345_CR16","unstructured":"Chorowski J, Bahdanau D, Serdyuk D, Cho K, Bengio Y (2015) Attention-Based Models for Speech Recognition. ArXiv.Org. June 24, 2015"},{"issue":"5","key":"345_CR17","doi-asserted-by":"publisher","first-page":"3946","DOI":"10.1109\/TAES.2022.3157660","volume":"58","author":"Y Shen","year":"2022","unstructured":"Shen, Y.: Bionic communication network and binary pigeon-inspired optimization for multiagent cooperative task allocation. IEEE Trans. Aerosp. Electron. Syst. 58(5), 3946\u20133961 (2022). https:\/\/doi.org\/10.1109\/TAES.2022.3157660","journal-title":"IEEE Trans. Aerosp. Electron. 
Syst."},{"issue":"August","key":"345_CR18","doi-asserted-by":"publisher","DOI":"10.1016\/j.measurement.2022.111594","volume":"199","author":"H Lv","year":"2022","unstructured":"Lv, H., Chen, J., Pan, T., Zhang, T., Feng, Y., Liu, S.: Attention mechanism in intelligent fault diagnosis of machinery: a review of technique and application. Measurement 199(August), 111594 (2022). https:\/\/doi.org\/10.1016\/j.measurement.2022.111594","journal-title":"Measurement"},{"issue":"October","key":"345_CR19","doi-asserted-by":"publisher","DOI":"10.1016\/j.patcog.2022.108837","volume":"130","author":"Q Shi","year":"2022","unstructured":"Shi, Q., Fan, J., Wang, Z., Zhang, Z.: Multimodal channel-wise attention transformer inspired by multisensory integration mechanisms of the brain. Pattern Recogn. 130(October), 108837 (2022). https:\/\/doi.org\/10.1016\/j.patcog.2022.108837","journal-title":"Pattern Recogn."},{"issue":"5","key":"345_CR20","doi-asserted-by":"publisher","first-page":"71","DOI":"10.1145\/3477002","volume":"20","author":"X Zhang","year":"2021","unstructured":"Zhang, X., Yawen, Wu., Zhou, P., Tang, X., Jingtong, Hu.: Algorithm-hardware co-design of attention mechanism on FPGA devices. Acm Trans Embedded Comput Syst 20(5), 71 (2021). https:\/\/doi.org\/10.1145\/3477002","journal-title":"Acm Trans Embedded Comput Syst"},{"issue":"8","key":"345_CR21","doi-asserted-by":"publisher","first-page":"3510","DOI":"10.1109\/TNNLS.2021.3053245","volume":"33","author":"J Ni","year":"2022","unstructured":"Ni, J., Huang, Z., Chang, Yu., Lv, D., Wang, C.: Comparative convolutional dynamic multi-attention recommendation model. Ieee Trans Neural Netw Learn Syst 33(8), 3510\u20133521 (2022). 
https:\/\/doi.org\/10.1109\/TNNLS.2021.3053245","journal-title":"Ieee Trans Neural Netw Learn Syst"},{"issue":"13","key":"345_CR22","doi-asserted-by":"publisher","first-page":"1721","DOI":"10.1111\/mice.12826","volume":"37","author":"J Chen","year":"2022","unstructured":"Chen, J., He, Ye.: A novel u-shaped encoder\u2013decoder network with attention mechanism for detection and evaluation of road cracks at pixel level. Comput-Aid Civ Infrastruct Eng 37(13), 1721\u20131736 (2022). https:\/\/doi.org\/10.1111\/mice.12826","journal-title":"Comput-Aid Civ Infrastruct Eng"},{"issue":"May","key":"345_CR23","doi-asserted-by":"publisher","first-page":"269","DOI":"10.1016\/j.neucom.2019.12.118","volume":"388","author":"S Du","year":"2020","unstructured":"Du, S., Li, T., Yang, Y., Horng, S.-J.: Multivariate time series forecasting via attention-based encoder\u2013decoder framework. Neurocomputing 388(May), 269\u2013279 (2020). https:\/\/doi.org\/10.1016\/j.neucom.2019.12.118","journal-title":"Neurocomputing"},{"issue":"8","key":"345_CR24","doi-asserted-by":"publisher","first-page":"3306","DOI":"10.1109\/TNNLS.2020.3015929","volume":"32","author":"L Feng","year":"2021","unstructured":"Feng, L., Zhao, C., Sun, Y.: Dual attention-based encoder\u2013decoder: a customized sequence-to-sequence learning for soft sensor development. IEEE Trans Neural Netw Learn Syst 32(8), 3306\u20133317 (2021). https:\/\/doi.org\/10.1109\/TNNLS.2020.3015929","journal-title":"IEEE Trans Neural Netw Learn Syst"},{"key":"345_CR25","unstructured":"Mikolov T (2012) Statistical language models based on neural networks. PhD thesis, Brno University of Technology"},{"issue":"11","key":"345_CR26","doi-asserted-by":"publisher","first-page":"2673","DOI":"10.1109\/78.650093","volume":"45","author":"M Schuster","year":"1997","unstructured":"Schuster, M., Paliwal, K.: Bidirectional recurrent neural networks. IEEE Trans. Signal Process. 45(11), 2673\u20132681 (1997)","journal-title":"IEEE Trans. 
Signal Process."},{"issue":"3","key":"345_CR27","doi-asserted-by":"publisher","first-page":"517","DOI":"10.1109\/TASLP.2015.2400218","volume":"23","author":"M Sundermeyer","year":"2015","unstructured":"Sundermeyer, M., Schluter, R.: From feedforward to recurrent LSTM neural networks for language modeling. IEEE\/ACM Trans Audio Speech Lang Process 23(3), 517\u2013529 (2015)","journal-title":"IEEE\/ACM Trans Audio Speech Lang Process"},{"issue":"January","key":"345_CR28","doi-asserted-by":"publisher","first-page":"64","DOI":"10.1016\/j.neucom.2019.09.003","volume":"372","author":"S Zhu","year":"2020","unstructured":"Zhu, S., Cheng, X., Sen, Su.: Knowledge-based question answering by tree-to-sequence learning. Neurocomputing 372(January), 64\u201372 (2020). https:\/\/doi.org\/10.1016\/j.neucom.2019.09.003","journal-title":"Neurocomputing"},{"key":"345_CR29","doi-asserted-by":"publisher","unstructured":"Liu T, Wang K, Sha L, Chang B, Sui Z Table-to-text generation by structure-aware Seq2seq learning. proceedings of the AAAI conference on artificial intelligence 32 https:\/\/doi.org\/10.1609\/aaai.v32i1.11925 (2018).","DOI":"10.1609\/aaai.v32i1.11925"},{"key":"345_CR30","unstructured":"Vaswani A, Shazeer N, Parmar N et al. (2017) Attention is all you need. In: Advances in neural information processing systems, pages 5998\u20136008"},{"issue":"September","key":"345_CR31","doi-asserted-by":"publisher","first-page":"48","DOI":"10.1016\/j.neucom.2021.03.091","volume":"452","author":"Z Niu","year":"2021","unstructured":"Niu, Z., Zhong, G., Hui, Yu.: A review on the attention mechanism of deep learning. Neurocomputing 452(September), 48\u201362 (2021). 
https:\/\/doi.org\/10.1016\/j.neucom.2021.03.091","journal-title":"Neurocomputing"},{"issue":"December","key":"345_CR32","doi-asserted-by":"publisher","first-page":"15","DOI":"10.1016\/j.specom.2020.09.005","volume":"125","author":"He Qun","year":"2020","unstructured":"Qun, He., Wenjing, L., Zhangli, C.: B&Anet: combining bidirectional LSTM and self-attention for end-to-end learning of task-oriented dialogue system. Speech Commun. 125(December), 15\u201323 (2020). https:\/\/doi.org\/10.1016\/j.specom.2020.09.005","journal-title":"Speech Commun."},{"key":"345_CR33","unstructured":"Beltagy Iz, Matthew EP, Cohan A (2020) Longformer: The Long-Document Transformer. arXiv."},{"issue":"October","key":"345_CR34","doi-asserted-by":"publisher","DOI":"10.1016\/j.patcog.2022.108748","volume":"130","author":"W Shan","year":"2022","unstructured":"Shan, W., Huang, D., Wang, J., Zou, F., Li, S.: Self-attention based fine-grained cross-media hybrid network. Pattern Recogn. 130(October), 108748 (2022). https:\/\/doi.org\/10.1016\/j.patcog.2022.108748","journal-title":"Pattern Recogn."},{"issue":"3","key":"345_CR35","doi-asserted-by":"publisher","first-page":"733","DOI":"10.1162\/coli_a_00445","volume":"48","author":"P Dufter","year":"2022","unstructured":"Dufter, P., Schmitt, M., Sch\u00fctze, H.: Position information in transformers: an overview. Comput. Linguist. 48(3), 733\u2013763 (2022). https:\/\/doi.org\/10.1162\/coli_a_00445","journal-title":"Comput. Linguist."},{"key":"345_CR36","doi-asserted-by":"publisher","unstructured":"Yida W, Ke P, Zheng Y, Huang K, Jiang Y, Zhu X, Huang M (2020) A large-scale chinese short-text conversation dataset. In: Paper presented at the Natural Language Processing and Chinese Computing, Cham. 
https:\/\/doi.org\/10.1007\/978-3-030-60450-9_8","DOI":"10.1007\/978-3-030-60450-9_8"},{"key":"345_CR37","doi-asserted-by":"publisher","first-page":"123","DOI":"10.1007\/s44196-023-00299-2","volume":"16","author":"HI Abdalla","year":"2023","unstructured":"Abdalla, H.I., Amer, A.A., Amer, Y.A., et al.: Boosting the item-based collaborative filtering model with novel similarity measures. Int J Comput Intell Syst 16, 123 (2023). https:\/\/doi.org\/10.1007\/s44196-023-00299-2","journal-title":"Int J Comput Intell Syst"},{"key":"345_CR38","doi-asserted-by":"publisher","unstructured":"Amer AA, Abdalla HI, Nguyen L (2021) Enhancing recommendation systems performance using highly-effective similarity measures. Knowl-Based Syst 217: 106842. https:\/\/doi.org\/10.1016\/j.knosys.2021.106842","DOI":"10.1016\/j.knosys.2021.106842"},{"issue":"October","key":"345_CR39","doi-asserted-by":"publisher","DOI":"10.1016\/j.aei.2021.101396","volume":"50","author":"Z Liu","year":"2021","unstructured":"Liu, Z., Liu, H., Jia, W., Zhang, D., Tan, J.: A multi-head neural network with unsymmetrical constraints for remaining useful life prediction. Adv. Eng. Inform. 50(October), 101396 (2021). https:\/\/doi.org\/10.1016\/j.aei.2021.101396","journal-title":"Adv. Eng. Inform."},{"issue":"September","key":"345_CR40","doi-asserted-by":"publisher","DOI":"10.1016\/j.eswa.2022.117275","volume":"202","author":"S Reza","year":"2022","unstructured":"Reza, S., Ferreira, M.C., Machado, J.J.M., Jo\u00e3o, M.R., Tavares, S.: A multi-head attention-based transformer model for traffic flow forecasting with a comparative analysis to recurrent neural networks. Expert Syst. Appl. 202(September), 117275 (2022). https:\/\/doi.org\/10.1016\/j.eswa.2022.117275","journal-title":"Expert Syst. 
Appl."},{"key":"345_CR41","doi-asserted-by":"publisher","unstructured":"Zhang L, Wang C-C, Chen X (2022) Predicting Drug-target binding affinity through molecule representation block based on multi-head attention and skip connection. Briefings Bioinform 23(6): bbac468. https:\/\/doi.org\/10.1093\/bib\/bbac468.","DOI":"10.1093\/bib\/bbac468"},{"issue":"April","key":"345_CR42","doi-asserted-by":"publisher","DOI":"10.7717\/peerj-cs.908","volume":"8","author":"W Zheng","year":"2022","unstructured":"Zheng, W., Yin, L.: Characterization inference based on joint-optimization of multi-layer semantics and deep fusion matching network. PeerJ Comput Sci 8(April), e908 (2022). https:\/\/doi.org\/10.7717\/peerj-cs.908","journal-title":"PeerJ Comput Sci"},{"issue":"7","key":"345_CR43","doi-asserted-by":"publisher","first-page":"3416","DOI":"10.3390\/app12073416","volume":"12","author":"W Zheng","year":"2022","unstructured":"Zheng, W., Zhou, Yu., Liu, S., Tian, J., Yang, Bo., Yin, L.: A deep fusion matching network semantic reasoning model. Appl. Sci. 12(7), 3416 (2022). https:\/\/doi.org\/10.3390\/app12073416","journal-title":"Appl. Sci."},{"issue":"3","key":"345_CR44","doi-asserted-by":"publisher","DOI":"10.1371\/journal.pone.0282514","volume":"18","author":"EA Atta","year":"2023","unstructured":"Atta, E.A., Ali, A.F., Elshamy, A.A.: A modified weighted chimp optimization algorithm for training feed-forward neural network Edited by Kathiravan Srinivasan. PLoS ONE 18(3), e0282514 (2023). https:\/\/doi.org\/10.1371\/journal.pone.0282514","journal-title":"PLoS ONE"},{"issue":"March","key":"345_CR45","doi-asserted-by":"publisher","DOI":"10.7717\/peerj-cs.353","volume":"7","author":"Z Ma","year":"2021","unstructured":"Ma, Z., Zheng, W., Chen, X., Yin, L.: Joint embedding VQA model based on dynamic word vector. PeerJ Computer Science 7(March), e353 (2021). 
https:\/\/doi.org\/10.7717\/peerj-cs.353","journal-title":"PeerJ Computer Science"},{"issue":"March","key":"345_CR46","doi-asserted-by":"publisher","DOI":"10.1155\/2022\/7479110","volume":"2022","author":"Yi Zong","year":"2022","unstructured":"Zong, Yi., Pan, E.: A SOM-based customer stratification model. Wirel. Commun. Mob. Comput. 2022(March), e7479110 (2022). https:\/\/doi.org\/10.1155\/2022\/7479110","journal-title":"Wirel. Commun. Mob. Comput."},{"key":"345_CR47","unstructured":"Gehring J, Auli M, Grangier D, Yarats D, Dauphin YN. (2017) Convolutional Sequence to Sequence Learning. In: Proceedings of the 34th international conference on machine learning, 1243\u201352. PMLR."},{"key":"345_CR48","unstructured":"Liu X, Yu H-F, Dhillon I, Hsieh C-J (2020) Learning to encode position for transformer with continuous dynamical model. In: Proceedings of the 37th international conference on machine learning, 6327\u201335. PMLR."},{"key":"345_CR49","doi-asserted-by":"publisher","unstructured":"Abdalla HI, Amer AA (2022) On the integration of similarity measures with machine learning models to enhance text classification performance. Inform Sci 614: 263\u2013288. https:\/\/doi.org\/10.1016\/j.ins.2022.10.004","DOI":"10.1016\/j.ins.2022.10.004"},{"key":"345_CR50","doi-asserted-by":"publisher","unstructured":"Abdalla HI, Amer AA, Ravana SD (2023) BoW-based neural networks vs. cutting-edge models for single-label text classification. Neural Comput Appl 35(27): 20103\u201320116. https:\/\/doi.org\/10.1007\/s00521-023-08754-z","DOI":"10.1007\/s00521-023-08754-z"},{"key":"345_CR51","doi-asserted-by":"publisher","unstructured":"Shang L, Lu Z, Li H (2015) Neural responding machine for short-text conversation. arXiv. https:\/\/doi.org\/10.48550\/arXiv.1503.02364.","DOI":"10.48550\/arXiv.1503.02364"},{"key":"345_CR52","unstructured":"Vinyals O, Le Q (2015) A neural conversational model. 
arXiv."},{"key":"345_CR53","doi-asserted-by":"publisher","unstructured":"Papineni K, Roukos S, Ward T, Zhu W-J (2002) Bleu: a method for automatic evaluation of machine translation. In: Proceedings of the 40th annual meeting of the association for computational linguistics, 311\u201318. Philadelphia, Pennsylvania, USA: Association for Computational Linguistics. https:\/\/doi.org\/10.3115\/1073083.1073135.","DOI":"10.3115\/1073083.1073135"},{"key":"345_CR54","doi-asserted-by":"crossref","unstructured":"Corley C, Mihalcea R (2005) Measuring the Semantic Similarity of Texts. In: Proceedings of the ACL workshop on empirical modeling of semantic equivalence and entailment, 13\u201318. Ann Arbor, Michigan: Association for Computational Linguistics.","DOI":"10.3115\/1631862.1631865"},{"key":"345_CR55","unstructured":"Lintean M, Rus V (2012) Measuring semantic similarity in short texts through greedy pairing and word semantics. In: Proceedings of the twenty-fifth international FLAIRS conference, Marco Island, FL, USA, 23\u201325 May"},{"issue":"1","key":"345_CR56","doi-asserted-by":"publisher","first-page":"55","DOI":"10.3390\/knowledge2010004","volume":"2","author":"S Yadav","year":"2022","unstructured":"Yadav, S., Kaushik, A.: Do you ever get off track in a conversation? the conversational system\u2019s anatomy and evaluation metrics. Knowledge 2(1), 55\u201387 (2022). https:\/\/doi.org\/10.3390\/knowledge2010004","journal-title":"Knowledge"},{"key":"345_CR57","unstructured":"Wieting J, Bansal M, Gimpel K, Livescu K (2016) Towards Universal Paraphrastic Sentence Embeddings. arXiv."},{"key":"345_CR58","doi-asserted-by":"publisher","first-page":"2211","DOI":"10.1109\/TASLP.2020.3003864","volume":"28","author":"S-H Zhong","year":"2020","unstructured":"Zhong, S.-H., Liu, P., Ming, Z., Liu, Y.: How to evaluate single-round dialogues like humans: an information-oriented metric. IEEE\/ACM Trans Audio Speech Lang Process 28, 2211\u20132223 (2020). 
https:\/\/doi.org\/10.1109\/TASLP.2020.3003864","journal-title":"IEEE\/ACM Trans Audio Speech Lang Process"},{"key":"345_CR59","doi-asserted-by":"publisher","first-page":"2502","DOI":"10.1109\/TASLP.2021.3074012","volume":"29","author":"C Zhang","year":"2021","unstructured":"Zhang, C., Lee, G., D\u2019Haro, L.F., Li, H.: D-score: holistic dialogue evaluation without reference. IEEE\/ACM Trans Audio Speech Lang Process 29, 2502\u20132516 (2021). https:\/\/doi.org\/10.1109\/TASLP.2021.3074012","journal-title":"IEEE\/ACM Trans Audio Speech Lang Process"},{"key":"345_CR60","doi-asserted-by":"crossref","unstructured":"Oluwatobi O, Mueller E (2020). DLGNet: A transformer-based model for dialogue response generation. In: Proceedings of the 2nd workshop on natural language processing for conversational AI","DOI":"10.18653\/v1\/2020.nlp4convai-1.7"},{"key":"345_CR61","doi-asserted-by":"crossref","unstructured":"Zhang Y, Sun S, Galley M, Chen Y-C, Brockett C, Gao X, Gao J, Liu J, Dolan B (2019) Dialogpt: Large-scale generative pre-training for conversational response generation. arXiv preprint arXiv:1911.00536.","DOI":"10.18653\/v1\/2020.acl-demos.30"},{"key":"345_CR62","doi-asserted-by":"crossref","unstructured":"Luo J, Zou X, Hou M (2022) A novel character-word fusion chinese named entity recognition model based on attention mechanism. 
In: 2022 IEEE 5th international conference on computer and communication engineering technology (CCET)","DOI":"10.1109\/CCET55412.2022.9906333"}],"container-title":["International Journal of Computational Intelligence Systems"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s44196-023-00345-z.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s44196-023-00345-z\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s44196-023-00345-z.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2023,11,18]],"date-time":"2023-11-18T21:35:50Z","timestamp":1700343350000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s44196-023-00345-z"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,10,23]]},"references-count":62,"journal-issue":{"issue":"1","published-online":{"date-parts":[[2023,12]]}},"alternative-id":["345"],"URL":"https:\/\/doi.org\/10.1007\/s44196-023-00345-z","relation":{},"ISSN":["1875-6883"],"issn-type":[{"value":"1875-6883","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,10,23]]},"assertion":[{"value":"23 May 2023","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"2 October 2023","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"23 October 2023","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"The authors declare that they have no relevant financial or 
non-financial interests to disclose.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflict of interest"}},{"value":"Not applicable.","order":3,"name":"Ethics","group":{"name":"EthicsHeading","label":"Employment"}},{"value":"The ethics statement is not applicable. This study does not include any animal or human studies.","order":4,"name":"Ethics","group":{"name":"EthicsHeading","label":"Ethical approval"}},{"value":"Not applicable.","order":5,"name":"Ethics","group":{"name":"EthicsHeading","label":"Informed consent statement"}},{"value":"Not applicable.","order":6,"name":"Ethics","group":{"name":"EthicsHeading","label":"Institutional review board statement"}},{"value":"Not applicable.","order":7,"name":"Ethics","group":{"name":"EthicsHeading","label":"Consent for publication"}}],"article-number":"168"}}