{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,20]],"date-time":"2026-03-20T15:30:55Z","timestamp":1774020655285,"version":"3.50.1"},"reference-count":25,"publisher":"Wiley","license":[{"start":{"date-parts":[[2022,1,7]],"date-time":"2022-01-07T00:00:00Z","timestamp":1641513600000},"content-version":"unspecified","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Scientific Programming"],"published-print":{"date-parts":[[2022,1,7]]},"abstract":"<jats:p>As one of the core tasks in the field of natural language processing, syntactic analysis has always been a hot topic for researchers, including tasks such as Questions and Answer (Q&amp;A), Search String Comprehension, Semantic Analysis, and Knowledge Base Construction. This paper aims to study the application of deep learning and neural network in natural language syntax analysis, which has significant research and application value. This paper first studies a transfer-based dependent syntax analyzer using a feed-forward neural network as a classifier. By analyzing the model, we have made meticulous parameters of the model to improve its performance. This paper proposes a dependent syntactic analysis model based on a long-term memory neural network. This model is based on the feed-forward neural network model described above and will be used as a feature extractor. After the feature extractor is pretrained, we use a long short-term memory neural network as a classifier of the transfer action, and the characteristics extracted by the syntactic analyzer as its input to train a recursive neural network classifier optimized by sentences. The classifier can not only classify the current pattern feature but also multirich information such as analysis of state history. 
Therefore, the model captures the analysis process of the entire sentence in syntactic analysis, replacing approaches that model each analysis state independently. The experimental results show that the model achieves a greater performance improvement than the baseline methods.<\/jats:p>","DOI":"10.1155\/2022\/6028693","type":"journal-article","created":{"date-parts":[[2022,1,7]],"date-time":"2022-01-07T17:05:22Z","timestamp":1641575122000},"page":"1-8","source":"Crossref","is-referenced-by-count":10,"title":["Natural Language Processing with Improved Deep Learning Neural Networks"],"prefix":"10.1155","volume":"2022","author":[{"ORCID":"https:\/\/orcid.org\/0000-0001-5023-7696","authenticated-orcid":true,"given":"YiTao","family":"Zhou","sequence":"first","affiliation":[{"name":"Hubei Research Center for Language and Intelligent Information Processing, Wuhan University, Wuhan 430072, China"}]}],"member":"311","reference":[{"key":"1","volume-title":"Foundations of statistical natural language processing","author":"C. D. Manning","year":"1999"},{"key":"2","volume-title":"Statistical Natural Language Processing","author":"C. Q. Zong","year":"2008"},{"key":"3","first-page":"72","article-title":"Query Understanding Enhanced by Hierarchical Parsing structures","author":"J. Liu"},{"key":"4","doi-asserted-by":"crossref","article-title":"Assessing the impact of syntactic and semantic structures for answer passages reranking","author":"K. Tymoshenko","DOI":"10.1145\/2806416.2806490"},{"key":"5","volume-title":"Dependency Parsing Features for Semantic Parsing","author":"W. Monroe","year":"2014"},{"key":"6","volume-title":"Pattern Recognition and Machine Learning (Information Science and Statistics)","author":"C. M. 
Bishop","year":"2006"},{"key":"7","doi-asserted-by":"publisher","DOI":"10.1038\/nature14539"},{"key":"8","doi-asserted-by":"publisher","DOI":"10.1109\/tpami.2008.137"},{"key":"9","first-page":"746","article-title":"Linguistic regularities in continuous space word representations","author":"T. Mikolov"},{"key":"10","doi-asserted-by":"publisher","DOI":"10.1162\/neco.1997.9.8.1735"},{"key":"11","first-page":"3104","article-title":"Sequence to Sequence Learning with Neural Networks","author":"I. Sutskever"},{"key":"12","doi-asserted-by":"crossref","DOI":"10.21437\/Interspeech.2012-65","volume-title":"LSTM Neural Networks for Language Modeling","author":"M. Sundermeyer","year":"2012"},{"key":"13","first-page":"2204","article-title":"Recurrent models of visual attention","author":"V. Mnih"},{"key":"14","article-title":"Neural machine translation by jointly learning to align and translate","author":"D. Bahdanau","year":"2014"},{"key":"15","article-title":"A neural attention model for abstractive sentence summarization","author":"A. M. Rush","year":"2015"},{"key":"16","first-page":"193","article-title":"Neocognitron: a self-organizing neural network model for a mechanism of pattern recognition unaffected by shift in position","volume-title":"Biological Cybernetics","author":"K. Fukushima","year":"1980"},{"key":"17","first-page":"98","article-title":"Parallel distributed processing: explorations in the microstructure of cognition","volume-title":"Language","author":"D. E. Rumelhart","year":"1986"},{"key":"18","doi-asserted-by":"publisher","DOI":"10.1007\/s11263-015-0816-y"},{"key":"19","doi-asserted-by":"publisher","DOI":"10.1162\/153244303322533223"},{"key":"20","doi-asserted-by":"crossref","article-title":"Learning structured embeddings of knowledge bases","author":"A. Bordes","DOI":"10.1609\/aaai.v25i1.7917"},{"key":"21","article-title":"Efficient estimation of word representations in vector space","author":"T. 
Mikolov","year":"2013"},{"key":"22","article-title":"Convolutional neural networks for sentence classification","author":"Y. Kim","year":"2014"},{"key":"23","article-title":"An introduction to deep learning in natural language processing: models, techniques, and tools","volume":"470","author":"I. Lauriola","year":"2021 22","journal-title":"Neurocomputing"},{"key":"24","doi-asserted-by":"publisher","DOI":"10.1016\/bs.host.2018.07.006"},{"key":"25","article-title":"Improved semantic representations from tree-structured long short-term memory networks","author":"K. S. Tai","year":"2015"}],"container-title":["Scientific Programming"],"original-title":[],"language":"en","link":[{"URL":"http:\/\/downloads.hindawi.com\/journals\/sp\/2022\/6028693.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"http:\/\/downloads.hindawi.com\/journals\/sp\/2022\/6028693.xml","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"http:\/\/downloads.hindawi.com\/journals\/sp\/2022\/6028693.pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2023,1,22]],"date-time":"2023-01-22T07:46:26Z","timestamp":1674373586000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.hindawi.com\/journals\/sp\/2022\/6028693\/"}},"subtitle":[],"editor":[{"given":"Rahman","family":"Ali","sequence":"additional","affiliation":[]}],"short-title":[],"issued":{"date-parts":[[2022,1,7]]},"references-count":25,"alternative-id":["6028693","6028693"],"URL":"https:\/\/doi.org\/10.1155\/2022\/6028693","relation":{},"ISSN":["1875-919X","1058-9244"],"issn-type":[{"value":"1875-919X","type":"electronic"},{"value":"1058-9244","type":"print"}],"subject":[],"published":{"date-parts":[[2022,1,7]]}}}