{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,5,14]],"date-time":"2025-05-14T02:44:48Z","timestamp":1747190688358,"version":"3.40.5"},"reference-count":30,"publisher":"Wiley","license":[{"start":{"date-parts":[[2021,5,13]],"date-time":"2021-05-13T00:00:00Z","timestamp":1620864000000},"content-version":"unspecified","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Mobile Information Systems"],"published-print":{"date-parts":[[2021,5,13]]},"abstract":"<jats:p>Neural machine translation has been widely concerned in recent years. The traditional sequential neural network framework of English translation has obvious disadvantages because of its poor ability to capture long-distance information, and the current improved framework, such as the recurrent neural network, still cannot solve this problem very well. In this paper, we propose a hybrid neural network that combines the convolutional neural network (CNN) and long short-term memory (LSTM) and introduce the attention mechanism based on the encoder-decoder structure to improve the translation accuracy, especially for long sentences. 
In the experiment, the model is implemented in TensorFlow, and the results show that the BLEU score of the proposed method is significantly higher than that of the traditional machine learning model, demonstrating the effectiveness of our method for English-Chinese translation.<\/jats:p>","DOI":"10.1155\/2021\/9985251","type":"journal-article","created":{"date-parts":[[2021,5,15]],"date-time":"2021-05-15T17:35:07Z","timestamp":1621100107000},"page":"1-10","source":"Crossref","is-referenced-by-count":1,"title":["Efficient English Translation Method and Analysis Based on the Hybrid Neural Network"],"prefix":"10.1155","volume":"2021","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-0066-5630","authenticated-orcid":true,"given":"Chuncheng","family":"Wang","sequence":"first","affiliation":[{"name":"Tongling University, Tongling 244061, China"}]}],"member":"311","reference":[{"author":"L. Lemao","key":"1","article-title":"Additive neural networks for statistical machine translation"},{"author":"D. Xiong","key":"2","article-title":"Enhancing language models in statistical machine translation with backward N-grams and mutual information triggers"},{"author":"P. Li","key":"3","article-title":"Recursive autoencoders for ITG-based translation"},{"key":"4","doi-asserted-by":"crossref","DOI":"10.21437\/Interspeech.2010-487","article-title":"Binary coding of speech spectrograms using a deep auto-encoder","author":"L. Deng","year":"2010","journal-title":"Interspeech"},{"author":"M. Junczys-Dowmunt","key":"5","article-title":"Is neural machine translation ready for deployment? A case study on 30 translation directions"},{"article-title":"Edinburgh neural machine translation systems for WMT 16","author":"R. Sennrich","key":"6","doi-asserted-by":"crossref","DOI":"10.18653\/v1\/W16-2323"},{"article-title":"Google\u2019s neural machine translation system: bridging the gap between human and machine translation","year":"2016","author":"Y. 
Wu","key":"7"},{"issue":"2","key":"8","article-title":"Deep recurrent models with fast-forward connections for neural machine translatin","volume":"4","author":"Z. Jie","year":"2016","journal-title":"Transactions of the Association for Computational Linguistics"},{"author":"N. Kalchbrenner","key":"9","article-title":"Recurrent continuous translation models"},{"author":"J. Brea","key":"10","article-title":"Sequence learning with hidden units in spiking neural networks"},{"author":"Z. Huang","key":"11","article-title":"Soft syntactic constraints for hierarchical phrase-based translation using latent syntactic distributions"},{"author":"I. Sutskever","key":"12","article-title":"Sequence to sequence learning with neural networks"},{"article-title":"Learning phrase representations using RNN encoder-decoder for statistical machine translation","year":"2014","author":"K. Cho","key":"13"},{"article-title":"Neural versus phrase-based machine translation quality: a case study","author":"L. Bentivogli","key":"14","doi-asserted-by":"crossref","DOI":"10.18653\/v1\/D16-1025"},{"author":"Z. Tu","key":"15","article-title":"Modeling coverage for neural machine translation"},{"author":"G. Druck","key":"16","article-title":"Rich prior knowledge in learning for natural language processing"},{"article-title":"Neural machine translation with reconstruction","year":"2016","author":"Z. Tu","key":"17"},{"author":"C. Yong","key":"18","article-title":"Semi-supervised learning for neural machine translation"},{"article-title":"An iterative deep learning framework for unsupervised discovery of speech features and linguistic units with applications on spoken term detection","year":"2016","author":"C. T. Chung","key":"19"},{"key":"20","first-page":"2048","article-title":"Show, attend and tell: neural image caption generation with visual attention","volume":"37","author":"K. 
Xu","year":"2015","journal-title":"Computer Science"},{"article-title":"Effective approaches to attention-based neural machine translation","year":"2015","author":"M. T. Luong","key":"21"},{"author":"L. Liu","key":"22","article-title":"Neural machine translation with supervised attention"},{"key":"23","doi-asserted-by":"crossref","DOI":"10.21437\/Interspeech.2012-618","article-title":"Unsupervised deep belief features for speech translation","author":"S. Maskey","year":"2012","journal-title":"Interspeech"},{"key":"24","doi-asserted-by":"publisher","DOI":"10.1016\/j.neucom.2018.08.055"},{"key":"25","article-title":"Research on remote sensing image classification based on improved decision tree classification algorithm","volume":"41","author":"Y. Bo","year":"2018","journal-title":"Computer Measurement & Control"},{"key":"26","doi-asserted-by":"publisher","DOI":"10.1109\/access.2017.2779939"},{"key":"27","doi-asserted-by":"publisher","DOI":"10.1016\/j.asoc.2020.106070"},{"key":"28","article-title":"Study on evaluation of tense accuracy in CNN-based google translation from English to Chinese","author":"X. Zhou","year":"2019","journal-title":"Modern Electronics Technique"},{"article-title":"Multichannel LSTM-CNN for Telugu technical domain identification","year":"2021","author":"S. Gundapu","key":"29"},{"article-title":"Attention-based LSTM for aspect-level sentiment classification","author":"Y. 
Wang","key":"30","doi-asserted-by":"crossref","DOI":"10.18653\/v1\/D16-1058"}],"container-title":["Mobile Information Systems"],"original-title":[],"language":"en","link":[{"URL":"http:\/\/downloads.hindawi.com\/journals\/misy\/2021\/9985251.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"http:\/\/downloads.hindawi.com\/journals\/misy\/2021\/9985251.xml","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"http:\/\/downloads.hindawi.com\/journals\/misy\/2021\/9985251.pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2022,12,27]],"date-time":"2022-12-27T13:26:30Z","timestamp":1672147590000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.hindawi.com\/journals\/misy\/2021\/9985251\/"}},"subtitle":[],"editor":[{"given":"Jianhui","family":"Lv","sequence":"additional","affiliation":[]}],"short-title":[],"issued":{"date-parts":[[2021,5,13]]},"references-count":30,"alternative-id":["9985251","9985251"],"URL":"https:\/\/doi.org\/10.1155\/2021\/9985251","relation":{},"ISSN":["1875-905X","1574-017X"],"issn-type":[{"type":"electronic","value":"1875-905X"},{"type":"print","value":"1574-017X"}],"subject":[],"published":{"date-parts":[[2021,5,13]]}}}