{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,31]],"date-time":"2026-01-31T11:10:27Z","timestamp":1769857827663,"version":"3.49.0"},"reference-count":48,"publisher":"Springer Science and Business Media LLC","issue":"2","license":[{"start":{"date-parts":[[2022,11,4]],"date-time":"2022-11-04T00:00:00Z","timestamp":1667520000000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2022,11,4]],"date-time":"2022-11-04T00:00:00Z","timestamp":1667520000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["61772429"],"award-info":[{"award-number":["61772429"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["61872296"],"award-info":[{"award-number":["61872296"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["U20B2065"],"award-info":[{"award-number":["U20B2065"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"name":"MOE (Ministry of Education in China) Project of Humanities and Social Sciences","award":["18YJC870001"],"award-info":[{"award-number":["18YJC870001"]}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Complex Intell. Syst."],"published-print":{"date-parts":[[2023,4]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:p>Text generation is a key tool in natural language applications. 
Generating texts that express rich ideas across several sentences requires a structured representation of their content. Many works utilize graph-based methods for graph-to-text generation, such as knowledge-graph-to-text generation. However, generating text from a knowledge graph still faces problems: the generated text may contain repetitions, and the entity information is not fully utilized. In this paper, we focus on knowledge-graph-to-text generation and develop a multi-level entity fusion representation (MEFR) model to address these problems, aiming to generate high-quality text from a knowledge graph. Our model introduces a fusion mechanism that aggregates node representations at the word level and the phrase level to obtain rich entity representations of the knowledge graph. A Graph Transformer is then adopted to encode the graph and output contextualized node representations. In addition, we develop a vanilla beam-search-based comparison mechanism during the decoding procedure, which considers similarity to reduce repetitive information in the generated text. Experimental results show that the proposed MEFR model effectively improves generation performance and outperforms other baselines on the AGENDA and WebNLG datasets. 
The results also demonstrate the importance of further exploring the information contained in the knowledge graph.<\/jats:p>","DOI":"10.1007\/s40747-022-00898-0","type":"journal-article","created":{"date-parts":[[2022,11,4]],"date-time":"2022-11-04T06:02:31Z","timestamp":1667541751000},"page":"2019-2030","update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":4,"title":["Enriched entity representation of knowledge graph for text generation"],"prefix":"10.1007","volume":"9","author":[{"given":"Kaile","family":"Shi","sequence":"first","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0002-1406-107X","authenticated-orcid":false,"given":"Xiaoyan","family":"Cai","sequence":"additional","affiliation":[]},{"given":"Libin","family":"Yang","sequence":"additional","affiliation":[]},{"given":"Jintao","family":"Zhao","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2022,11,4]]},"reference":[{"key":"898_CR1","doi-asserted-by":"publisher","DOI":"10.1017\/CBO9780511519857","volume-title":"Building natural language generation systems, studies in natural language processing","author":"E Reiter","year":"2000","unstructured":"Reiter E, Dale R (2000) Building natural language generation systems, studies in natural language processing. Cambridge University Press, Cambridge. https:\/\/doi.org\/10.1017\/CBO9780511519857"},{"key":"898_CR2","doi-asserted-by":"publisher","unstructured":"Hu Y, Wan X (2014) Automatic generation of related work sections in scientific papers: an optimization approach. In: Proceedings of the 2014 Conference on empirical methods in natural language processing (EMNLP), Association for Computational Linguistics, pp 1624\u20131633. https:\/\/doi.org\/10.3115\/v1\/D14-1170","DOI":"10.3115\/v1\/D14-1170"},{"key":"898_CR3","doi-asserted-by":"crossref","unstructured":"Mei H, Bansal M, Walter MR (2016) What to talk about and how? 
Selective generation using lstms with coarse-to-fine alignment. arXiv:1509.00838","DOI":"10.18653\/v1\/N16-1086"},{"key":"898_CR4","doi-asserted-by":"crossref","unstructured":"Vinyals O, Toshev A, Bengio S, Erhan D (2015) Show and tell: a neural image caption generator. In: Proceedings of the IEEE conference on computer vision and pattern recognition (CVPR). IEEE Computer Society, pp 3156\u20133164","DOI":"10.1109\/CVPR.2015.7298935"},{"key":"898_CR5","doi-asserted-by":"publisher","DOI":"10.1016\/j.jksuci.2020.04.001","author":"T Iqbal","year":"2020","unstructured":"Iqbal T, Qureshi S (2020) The survey: text generation models in deep learning. J King Saud Univ Comput Inf Sci. https:\/\/doi.org\/10.1016\/j.jksuci.2020.04.001","journal-title":"J King Saud Univ Comput Inf Sci"},{"key":"898_CR6","doi-asserted-by":"publisher","unstructured":"Nie F, Wang J, Yao J-G, Pan R, Lin C-Y (2018) Operation-guided neural networks for high fidelity data-to-text generation. In: Proceedings of the 2018 Conference on empirical methods in natural language processing, association for computational linguistics, pp 3879\u20133889. https:\/\/doi.org\/10.18653\/v1\/D18-1422","DOI":"10.18653\/v1\/D18-1422"},{"key":"898_CR7","doi-asserted-by":"publisher","unstructured":"Wiseman S, Shieber S, Rush A (2018) Learning neural templates for text generation. In: Proceedings of the 2018 Conference on empirical methods in natural language processing, Association for Computational Linguistics, pp 3174\u20133187. https:\/\/doi.org\/10.18653\/v1\/D18-1356","DOI":"10.18653\/v1\/D18-1356"},{"key":"898_CR8","unstructured":"Li L, Wan X (2018) Point precisely: towards ensuring the precision of data in generated texts using delayed copy mechanism. 
In: Proceedings of the 27th International Conference on computational linguistics, Association for Computational Linguistics, pp 1044\u20131055"},{"key":"898_CR9","doi-asserted-by":"publisher","unstructured":"Puduppully R, Dong L, Lapata M (2019) Data-to-text generation with content selection and planning. In: Proceedings of the 33rd AAAI Conference on artificial intelligence, pp 6908\u20136915. https:\/\/doi.org\/10.1609\/aaai.v33i01.33016908","DOI":"10.1609\/aaai.v33i01.33016908"},{"key":"898_CR10","doi-asserted-by":"publisher","first-page":"10","DOI":"10.1007\/s40747-021-00332-x","volume":"15","author":"A Mohan","year":"2021","unstructured":"Mohan A, Pramod KV (2021) Temporal network embedding using graph attention network. Complex Intell Syst 15:10. https:\/\/doi.org\/10.1007\/s40747-021-00332-x","journal-title":"Complex Intell Syst"},{"key":"898_CR11","doi-asserted-by":"publisher","first-page":"10","DOI":"10.1007\/s40747-021-00343-8","volume":"15","author":"Z Huang","year":"2021","unstructured":"Huang Z, Xie Z (2021) A patent keywords extraction method using textrank model with prior public knowledge. Complex Intell Syst 15:10. https:\/\/doi.org\/10.1007\/s40747-021-00343-8","journal-title":"Complex Intell Syst"},{"key":"898_CR12","unstructured":"Xu K, Wu L, Wang Z, Feng Y, Witbrock M, Sheinin V (2018) Graph2seq: graph to sequence learning with attention-based neural networks. arXiv:1804.00823"},{"key":"898_CR13","doi-asserted-by":"publisher","unstructured":"Beck D, Haffari G, Cohn T (2018) Graph-to-sequence learning using gated graph neural networks. In: Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Association for Computational Linguistics, pp 273\u2013283. 
https:\/\/doi.org\/10.18653\/v1\/P18-1026","DOI":"10.18653\/v1\/P18-1026"},{"key":"898_CR14","doi-asserted-by":"publisher","unstructured":"Li W, Xu J, He Y, Yan S, Wu Y, Sun X (2019) Coherent comments generation for Chinese articles with a graph-to-sequence model. In: Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Association for Computational Linguistics, pp 4843\u20134852. https:\/\/doi.org\/10.18653\/v1\/P19-1479","DOI":"10.18653\/v1\/P19-1479"},{"key":"898_CR15","doi-asserted-by":"publisher","first-page":"589","DOI":"10.1162\/tacl_a_00332","volume":"8","author":"LF Ribeiro","year":"2020","unstructured":"Ribeiro LF, Zhang Y, Gardent C, Gurevych I (2020) Modeling global and local node contexts for text generation from knowledge graphs. Trans Assoc Comput Linguist 8:589\u2013604","journal-title":"Trans Assoc Comput Linguist"},{"key":"898_CR16","doi-asserted-by":"publisher","unstructured":"Koncel-Kedziorski R, Bekal D, Luan Y, Lapata M, Hajishirzi H (2019) Text Generation from Knowledge Graphs with Graph Transformers, In: Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), Association for Computational Linguistics, pp 2284\u20132293. https:\/\/doi.org\/10.18653\/v1\/N19-1238","DOI":"10.18653\/v1\/N19-1238"},{"key":"898_CR17","unstructured":"Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser Lu, Polosukhin I (2017) Attention is all you need. In: Proceedings of the 31st international conference on neural information processing systems, vol 30. Curran Associates, Inc, pp 6000\u20136010"},{"key":"898_CR18","unstructured":"Veli\u010dkovi\u0107 P, Cucurull G, Casanova A, Romero A, Li\u00f2 P, Bengio Y (2018) Graph attention networks. arXiv:1710.10903"},{"key":"898_CR19","doi-asserted-by":"crossref","unstructured":"Graves A (2012) Sequence transduction with recurrent neural networks. 
arXiv:1211.3711","DOI":"10.1007\/978-3-642-24797-2"},{"key":"898_CR20","unstructured":"Sutskever I, Vinyals O, Le QV (2014) Sequence to sequence learning with neural networks. In: Proceedings of the 27th International Conference on Neural information processing systems 2:3104\u20133112"},{"key":"898_CR21","unstructured":"Angeli G, Liang P, Klein D (2010) A simple domain-independent probabilistic approach to generation. In: Proceedings of the 2010 Conference on empirical methods in natural language processing, Association for Computational Linguistics, pp 502\u2013512"},{"key":"898_CR22","unstructured":"Kondadadi R, Howald B, Schilder F (2013) A statistical NLG framework for aggregated planning and realization. In: Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Association for Computational Linguistics, pp 1406\u20131415"},{"key":"898_CR23","unstructured":"Howald B, Kondadadi R, Schilder F (2013) Domain adaptable semantic clustering in statistical NLG. In: Proceedings of the 10th International Conference on computational semantics (IWCS 2013)\u2014Long Papers, Association for Computational Linguistics, pp 143\u2013154"},{"key":"898_CR24","doi-asserted-by":"publisher","unstructured":"Juraska J, Karagiannis P, Bowden K, Walker M (2018) A deep ensemble model with slot alignment for sequence-to-sequence natural language generation. In: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long Papers), Association for Computational Linguistics, pp 152\u2013162. https:\/\/doi.org\/10.18653\/v1\/N18-1014","DOI":"10.18653\/v1\/N18-1014"},{"key":"898_CR25","doi-asserted-by":"publisher","unstructured":"Gehrmann S, Dai F, Elder H, Rush A (2018) End-to-end content and plan selection for data-to-text generation. 
In: Proceedings of the 11th International Conference on natural language generation, Association for Computational Linguistics pp 46\u201356. https:\/\/doi.org\/10.18653\/v1\/W18-6505","DOI":"10.18653\/v1\/W18-6505"},{"key":"898_CR26","doi-asserted-by":"publisher","unstructured":"Freitag M, Roy S (2018) Unsupervised natural language generation with denoising autoencoders. In: Proceedings of the 2018 Conference on empirical methods in natural language processing, Association for Computational Linguistics, pp 3922\u20133929. https:\/\/doi.org\/10.18653\/v1\/D18-1426","DOI":"10.18653\/v1\/D18-1426"},{"key":"898_CR27","doi-asserted-by":"publisher","unstructured":"Xu K, Wu L, Wang Z, Feng Y, Sheinin V (2018) SQL-to-text generation with graph-to-sequence model. In: Proceedings of the 2018 Conference on empirical methods in natural language processing, Association for Computational Linguistics, pp 931\u2013936. https:\/\/doi.org\/10.18653\/v1\/D18-1112","DOI":"10.18653\/v1\/D18-1112"},{"key":"898_CR28","doi-asserted-by":"publisher","unstructured":"Song L, Wang A, Su J, Zhang Y, Xu K, Ge Y, Yu D (2018) Structural information preserving for graph-to-text generation. In: Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Association for Computational Linguistics, pp 7987\u20137998. https:\/\/doi.org\/10.18653\/v1\/2020.acl-main.712","DOI":"10.18653\/v1\/2020.acl-main.712"},{"key":"898_CR29","doi-asserted-by":"crossref","unstructured":"Guo Z, Zhang Y, Teng Z, Lu W (2019) Densely connected graph convolutional networks for graph-to-sequence learning. arXiv:1908.05957","DOI":"10.1162\/tacl_a_00269"},{"issue":"11","key":"898_CR30","doi-asserted-by":"publisher","first-page":"2673","DOI":"10.1109\/78.650093","volume":"45","author":"M Schuster","year":"1997","unstructured":"Schuster M, Paliwal KK (1997) Bidirectional recurrent neural networks. IEEE Trans Signal Process 45(11):2673\u20132681. 
https:\/\/doi.org\/10.1109\/78.650093","journal-title":"IEEE Trans Signal Process"},{"key":"898_CR31","doi-asserted-by":"publisher","unstructured":"Luong T, Pham H, Manning CD (2015) Effective approaches to attention-based neural machine translation. In: Proceedings of the 2015 Conference on empirical methods in natural language processing, Association for Computational Linguistics, pp 1412\u20131421. https:\/\/doi.org\/10.18653\/v1\/D15-1166","DOI":"10.18653\/v1\/D15-1166"},{"key":"898_CR32","doi-asserted-by":"publisher","unstructured":"See A, Liu PJ, Manning CD (2017) Get to the point: Summarization with pointer-generator networks. In: Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Association for Computational Linguistics, pp 1073\u20131083. https:\/\/doi.org\/10.18653\/v1\/P17-1099","DOI":"10.18653\/v1\/P17-1099"},{"key":"898_CR33","unstructured":"Srivastava RK, Greff K, Schmidhuber J (2015) Highway networks. arXiv:1505.00387"},{"key":"898_CR34","doi-asserted-by":"crossref","unstructured":"Luong M-T, Pham H, Manning CD (2015) Effective approaches to attention-based neural machine translation. In: Proceedings of the 2015 Conference on empirical methods in natural language processing, pp 1412\u20131421","DOI":"10.18653\/v1\/D15-1166"},{"key":"898_CR35","doi-asserted-by":"publisher","unstructured":"Ammar W, Groeneveld D, Bhagavatula C, Beltagy I, Crawford M, Downey D, Dunkelberger J, Elgohary A, Feldman S, Ha V, Kinney R, Kohlmeier S, Lo K, Murray T, Ooi H-H, Peters M, Power J, Skjonsberg S, Wang L, Wilhelm C, Yuan Z, van Zuylen M, Etzioni O (2018) Construction of the literature graph in semantic scholar. In: Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 3 (Industry Papers), Association for Computational Linguistics, pp 84\u201391. 
https:\/\/doi.org\/10.18653\/v1\/N18-3011","DOI":"10.18653\/v1\/N18-3011"},{"key":"898_CR36","doi-asserted-by":"crossref","unstructured":"Gardent C, Shimorina A, Narayan S, Perez-Beltrachini L (2017) The webnlg challenge: Generating text from rdf data. In: Proceedings of the 10th International Conference on natural language generation, pp 124\u2013133","DOI":"10.18653\/v1\/W17-3518"},{"key":"898_CR37","doi-asserted-by":"crossref","unstructured":"Auer S, Bizer C, Kobilarov G, Lehmann J, Cyganiak R, Ives Z (2007) Dbpedia: a nucleus for a web of open data. In: Proceedings of the 6th international the semantic web and 2nd Asian conference on Asian semantic web conference. Springer, pp 722\u201373","DOI":"10.1007\/978-3-540-76298-0_52"},{"key":"898_CR38","doi-asserted-by":"publisher","DOI":"10.1162\/neco.1997.9.8.1735","author":"S Hochreiter","year":"1997","unstructured":"Hochreiter S, Schmidhuber J (1997) Long short-term memory. Neural Comput. https:\/\/doi.org\/10.1162\/neco.1997.9.8.1735","journal-title":"Neural Comput"},{"key":"898_CR39","doi-asserted-by":"crossref","unstructured":"Prechelt L (1998) Early stopping-but when? In: Neural networks: tricks of the trade, this book is an Outgrowth of a 1996 NIPS Workshop, Springer-Verlag, pp 55\u201369","DOI":"10.1007\/3-540-49430-8_3"},{"key":"898_CR40","doi-asserted-by":"publisher","DOI":"10.1016\/S0893-6080(98)00116-6","author":"N Qian","year":"1999","unstructured":"Qian N (1999) On the momentum term in gradient descent learning algorithms. Neural Netw. https:\/\/doi.org\/10.1016\/S0893-6080(98)00116-6","journal-title":"Neural Netw"},{"key":"898_CR41","doi-asserted-by":"publisher","unstructured":"Papineni K, Roukos S, Ward T, Zhu W-J (2002) Bleu: a method for automatic evaluation of machine translation. In: Proceedings of the 40th Annual Meeting on Association for Computational Linguistics, Association for Computational Linguistics, pp 311\u2013318. 
https:\/\/doi.org\/10.3115\/1073083.1073135","DOI":"10.3115\/1073083.1073135"},{"key":"898_CR42","unstructured":"Lin C (2004) Rouge: a package for automatic evaluation of summaries. In: Proceedings of workshop on text summarization branches out, post conference workshop of ACL 2004. Association for Computational Linguistics, pp 74\u201381"},{"key":"898_CR43","doi-asserted-by":"crossref","unstructured":"Marcheggiani D, Perez-Beltrachini L (2018) Deep graph convolutional encoders for structured data to text generation. arXiv:1810.09995","DOI":"10.18653\/v1\/W18-6501"},{"key":"898_CR44","unstructured":"An B, Dong X, Chen C (2019) Repulsive Bayesian sampling for diversified attention modeling. In: 4th workshop on Bayesian deep learning (NeurIPS 2019), pp 1\u201310"},{"key":"898_CR45","doi-asserted-by":"crossref","unstructured":"Schmitt M, Ribeiro LF, Dufter P, Gurevych I, Sch\u00fctze H (2020) Modeling graph structure via relative position for text generation from knowledge graphs. arXiv preprint arXiv:2006.09242","DOI":"10.18653\/v1\/11.textgraphs-1.2"},{"key":"898_CR46","unstructured":"Ferreira TC, van der Lee C, Van Miltenburg E, Krahmer E (2019) Neural data-to-text generation: a comparison between pipeline and end-to-end architectures. In: Proceedings of the 2019 Conference on empirical methods in natural language processing and the 9th international joint conference on natural language processing (EMNLP-IJCNLP), vol 1, pp 552\u2013562"},{"key":"898_CR47","unstructured":"Distiawan B, Qi J, Zhang R, Wang W (2018) Gtr-lstm: a triple encoder for sentence generation from rdf data. In: Proceedings of the 56th annual meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pp 1627\u20131637"},{"key":"898_CR48","unstructured":"Moryossef A, Goldberg Y, Dagan I (2019) Step-by-step: separating planning from realization in neural data-to-text generation. 
In: Proceedings of the 2019 conference of the North American chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers), pp 2267\u20132277"}],"container-title":["Complex &amp; Intelligent Systems"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s40747-022-00898-0.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s40747-022-00898-0\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s40747-022-00898-0.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2023,4,18]],"date-time":"2023-04-18T09:40:44Z","timestamp":1681810844000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s40747-022-00898-0"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,11,4]]},"references-count":48,"journal-issue":{"issue":"2","published-print":{"date-parts":[[2023,4]]}},"alternative-id":["898"],"URL":"https:\/\/doi.org\/10.1007\/s40747-022-00898-0","relation":{},"ISSN":["2199-4536","2198-6053"],"issn-type":[{"value":"2199-4536","type":"print"},{"value":"2198-6053","type":"electronic"}],"subject":[],"published":{"date-parts":[[2022,11,4]]},"assertion":[{"value":"16 November 2021","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"17 October 2022","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"4 November 2022","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"We declare that we do not have any commercial or associative interest 
that represents a conflict of interest in connection with the work submitted.","order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}}]}}