{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,2,13]],"date-time":"2026-02-13T17:15:44Z","timestamp":1771002944918,"version":"3.50.1"},"reference-count":16,"publisher":"SAGE Publications","issue":"6","license":[{"start":{"date-parts":[[2024,11,1]],"date-time":"2024-11-01T00:00:00Z","timestamp":1730419200000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/journals.sagepub.com\/page\/policies\/text-and-data-mining-license"}],"content-domain":{"domain":["journals.sagepub.com"],"crossmark-restriction":true},"short-container-title":["Journal of Computational Methods in Sciences and Engineering"],"published-print":{"date-parts":[[2024,11]]},"abstract":"<jats:p>In the current landscape, there is a growing urgency for advanced text generation technology, prompting a surge of interest in the exploration of highly efficient text generative models based on deep learning. In this paper, the long short-term memory (LSTM) unit was selected as the text generator and used as the decoder to generate its own retelling text. The feature extractor was BERT (Bidirectional Encoder Representation from Transformers), and the embedding layer of BERT included three parts: words, fragments, and locations. The BLEU (Bilingual Evaluation Understudy) score was used to measure the similarity between real text and generated text. The BLEU score for text paraphrasing generation under the LSTM network method was 0.98, while the BLEU score for text paraphrasing generation under the Transformer network method was 0.89. The BLEU score of text paraphrase generation under the Transformer-based sequence to sequence model method was 0.97, and the BLEU score of text paraphrase generation under the Transformer\u2019s general text generative model method was 0.96. The LSTM network method was the best method compared with other methods. 
This article met modern society\u2019s technical requirements for abstract generation and further improved the readability and completeness of the generated abstracts.<\/jats:p>","DOI":"10.1177\/14727978241299204","type":"journal-article","created":{"date-parts":[[2025,1,31]],"date-time":"2025-01-31T04:22:21Z","timestamp":1738297341000},"page":"4089-4100","update-policy":"https:\/\/doi.org\/10.1177\/sage-journals-update-policy","source":"Crossref","is-referenced-by-count":1,"title":["Investigation on text generative model based on deep learning in natural language processing"],"prefix":"10.1177","volume":"24","author":[{"given":"Xianqiu","family":"Zheng","sequence":"first","affiliation":[{"name":"Guilin University of Electronic Technology"},{"name":"Shanxi Institute of Technology"}]},{"given":"Zhidong","family":"Zhang","sequence":"additional","affiliation":[{"name":"Shanxi Institute of Technology"}]},{"given":"Liqin","family":"Wang","sequence":"additional","affiliation":[{"name":"Shanxi Institute of Technology"}]},{"given":"Jinhua","family":"Wu","sequence":"additional","affiliation":[{"name":"Shanxi Institute of Technology"}]},{"given":"Zuofeng","family":"Dong","sequence":"additional","affiliation":[{"name":"Shanxi Institute of Technology"}]}],"member":"179","published-online":{"date-parts":[[2024,11,21]]},"reference":[{"issue":"6","key":"e_1_3_2_2_2","first-page":"1398","article-title":"Question answer selection based on multiscale similarity feature","volume":"40","author":"Chen KJ","year":"2018","unstructured":"Chen KJ, Hou JA, Guo Z, et al. Question answer selection based on multiscale similarity feature. 
Syst Eng Electron 2018; 40(6): 1398\u20131404.","journal-title":"Syst Eng Electron"},{"key":"e_1_3_2_3_2","doi-asserted-by":"publisher","DOI":"10.1007\/s10844-022-00757-x"},{"issue":"16","key":"e_1_3_2_4_2","first-page":"2001","article-title":"A performance study of text summarization model using heterogeneous data sources","volume":"119","author":"Sabir S","year":"2018","unstructured":"Sabir S, Shanmugasundaram H. A performance study of text summarization model using heterogeneous data sources. Int J Pure Appl Math: IJPAM 2018; 119(16): 2001\u20132007.","journal-title":"Int J Pure Appl Math: IJPAM"},{"issue":"5","key":"e_1_3_2_5_2","first-page":"855","article-title":"Research and application of discriminant enhanced generation antagonism model in text to image generation","volume":"44","author":"Tan HC","year":"2022","unstructured":"Tan HC, Huang SH, Xiao HW, et al. Research and application of discriminant enhanced generation antagonism model in text to image generation. Comp Eng Sci 2022; 44(5): 855\u2013861.","journal-title":"Comp Eng Sci"},{"issue":"6","key":"e_1_3_2_6_2","first-page":"1083","article-title":"A text generation image model based on spectrum normalization and two-stage stack structure Generative adversarial network","volume":"44","author":"Wang X","year":"2022","unstructured":"Wang X, Xu HY, Zhu XZ. A text generation image model based on spectrum normalization and two-stage stack structure Generative adversarial network. Comp Eng Sci 2022; 44(6): 1083\u20131089.","journal-title":"Comp Eng Sci"},{"issue":"012","key":"e_1_3_2_7_2","first-page":"3445","article-title":"Chinese text abstract generation based on fa-tr model","volume":"042","author":"Gao W","year":"2021","unstructured":"Gao W, Ma H, Li DZ, et al. Chinese text abstract generation based on fa-tr model. 
Comp Eng Des 2021; 042(012): 3445\u20133452.","journal-title":"Comp Eng Des"},{"key":"e_1_3_2_8_2","doi-asserted-by":"publisher","DOI":"10.1002\/widm.1345"},{"key":"e_1_3_2_9_2","doi-asserted-by":"publisher","DOI":"10.5815\/ijmecs.2018.09.02"},{"key":"e_1_3_2_10_2","doi-asserted-by":"publisher","DOI":"10.1007\/s13042-022-01553-3"},{"issue":"10","key":"e_1_3_2_11_2","first-page":"75292861","article-title":"An integrated deep generative model for text classification and generation","author":"Wang Z","year":"2018","unstructured":"Wang Z, Wu QB. An integrated deep generative model for text classification and generation. Math Probl Eng 2018; PT(10): 75292861\u2013775292868.","journal-title":"Math Probl Eng"},{"key":"e_1_3_2_12_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.neucom.2018.02.034"},{"issue":"12","key":"e_1_3_2_13_2","first-page":"155","article-title":"Document summary optimization generation based on joint information sharing with self-attention","volume":"5","author":"Di J","year":"2019","unstructured":"Di J, Qi RJ. Document summary optimization generation based on joint information sharing with self-attention. Int Core J Eng 2019; 5(12): 155\u2013159.","journal-title":"Int Core J Eng"},{"key":"e_1_3_2_14_2","doi-asserted-by":"publisher","DOI":"10.4149\/cai_2018_5_1126"},{"issue":"4","key":"e_1_3_2_15_2","first-page":"90","article-title":"News summary generation framework based on ontology and natural language processing techniques","volume":"9","author":"George SK","year":"2019","unstructured":"George SK, Raj VP. News summary generation framework based on ontology and natural language processing techniques. Comp Eng Inf Technol 2019; 9(4): 90\u201395.","journal-title":"Comp Eng Inf Technol"},{"issue":"7","key":"e_1_3_2_16_2","first-page":"1570","article-title":"Automatic generation of comparative summary for scientific literature","volume":"14","author":"Liu Y","year":"2018","unstructured":"Liu Y, Yang YQ, Huang Y. 
Automatic generation of comparative summary for scientific literature. Int J Perform Eng 2018; 14(7): 1570\u20131579.","journal-title":"Int J Perform Eng"},{"issue":"2","key":"e_1_3_2_17_2","first-page":"112","article-title":"A text generation image method based on conditional enhancement and attention mechanism","volume":"37","author":"Zhang J","year":"2023","unstructured":"Zhang J, Zhang LH. A text generation image method based on conditional enhancement and attention mechanism. J Test Meas Technol 2023; 37(2): 112\u2013119.","journal-title":"J Test Meas Technol"}],"container-title":["Journal of Computational Methods in Sciences and Engineering"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.1177\/14727978241299204","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/full-xml\/10.1177\/14727978241299204","content-type":"application\/xml","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/journals.sagepub.com\/doi\/pdf\/10.1177\/14727978241299204","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2026,2,13]],"date-time":"2026-02-13T16:31:02Z","timestamp":1771000262000},"score":1,"resource":{"primary":{"URL":"https:\/\/journals.sagepub.com\/doi\/10.1177\/14727978241299204"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,11]]},"references-count":16,"journal-issue":{"issue":"6","published-print":{"date-parts":[[2024,11]]}},"alternative-id":["10.1177\/14727978241299204"],"URL":"https:\/\/doi.org\/10.1177\/14727978241299204","relation":{},"ISSN":["1472-7978","1875-8983"],"issn-type":[{"value":"1472-7978","type":"print"},{"value":"1875-8983","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,11]]}}}