{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T01:51:02Z","timestamp":1760147462747,"version":"build-2065373602"},"reference-count":34,"publisher":"MDPI AG","issue":"2","license":[{"start":{"date-parts":[[2023,2,6]],"date-time":"2023-02-06T00:00:00Z","timestamp":1675641600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"National Key Research and Development Program of China","award":["2018YFC0830105","2018YFC0830101","2018YFC0830100","62266027","U21B2027","61972186","202202AD080003","202001AT070047","202201BE070001-021"],"award-info":[{"award-number":["2018YFC0830105","2018YFC0830101","2018YFC0830100","62266027","U21B2027","61972186","202202AD080003","202001AT070047","202201BE070001-021"]}]},{"name":"National Natural Science Foundation of China","award":["2018YFC0830105","2018YFC0830101","2018YFC0830100","62266027","U21B2027","61972186","202202AD080003","202001AT070047","202201BE070001-021"],"award-info":[{"award-number":["2018YFC0830105","2018YFC0830101","2018YFC0830100","62266027","U21B2027","61972186","202202AD080003","202001AT070047","202201BE070001-021"]}]},{"name":"Yunnan provincial major science and technology special plan projects","award":["2018YFC0830105","2018YFC0830101","2018YFC0830100","62266027","U21B2027","61972186","202202AD080003","202001AT070047","202201BE070001-021"],"award-info":[{"award-number":["2018YFC0830105","2018YFC0830101","2018YFC0830100","62266027","U21B2027","61972186","202202AD080003","202001AT070047","202201BE070001-021"]}]},{"name":"general projects of basic research in Yunnan 
Province","award":["2018YFC0830105","2018YFC0830101","2018YFC0830100","62266027","U21B2027","61972186","202202AD080003","202001AT070047","202201BE070001-021"],"award-info":[{"award-number":["2018YFC0830105","2018YFC0830101","2018YFC0830100","62266027","U21B2027","61972186","202202AD080003","202001AT070047","202201BE070001-021"]}]},{"name":"Kunming University of Science and Technology","award":["2018YFC0830105","2018YFC0830101","2018YFC0830100","62266027","U21B2027","61972186","202202AD080003","202001AT070047","202201BE070001-021"],"award-info":[{"award-number":["2018YFC0830105","2018YFC0830101","2018YFC0830100","62266027","U21B2027","61972186","202202AD080003","202001AT070047","202201BE070001-021"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Information"],"abstract":"<jats:p>Case\u2013public opinion summarization refers to generating case-related sentences from public opinion information about judicial cases. Case\u2013public opinion news covers judicial cases (intentional homicide, rape, etc.) that give rise to widespread public opinion, and it usually contains case element information such as the suspect, victim, time, place, process, and sentencing of the case. In multi-document case\u2013public opinion summarization, information overlap and redundancy across different documents on the same case make it difficult to generate a concise and fluent summary; this paper therefore proposes an abstractive case\u2013public opinion summarization model based on case element graph attention. Firstly, multiple public opinion documents on the same case are split into paragraphs, and the paragraphs and case elements are encoded with a Transformer-based method to construct a heterogeneous graph containing paragraph nodes and case element nodes. 
Finally, in the decoding process, a two-layer attention mechanism is applied over the case element nodes and paragraph nodes, so that the model effectively reduces redundancy in summary generation.<\/jats:p>","DOI":"10.3390\/info14020097","type":"journal-article","created":{"date-parts":[[2023,2,6]],"date-time":"2023-02-06T05:29:05Z","timestamp":1675661345000},"page":"97","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":1,"title":["Abstractive Summary of Public Opinion News Based on Element Graph Attention"],"prefix":"10.3390","volume":"14","author":[{"given":"Yuxin","family":"Huang","sequence":"first","affiliation":[{"name":"Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, China"},{"name":"Yunnan Key Laboratory of Artificial Intelligence, Kunming University of Science and Technology, Kunming 650500, China"}]},{"given":"Shukai","family":"Hou","sequence":"additional","affiliation":[{"name":"Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, China"},{"name":"Yunnan Key Laboratory of Artificial Intelligence, Kunming University of Science and Technology, Kunming 650500, China"}]},{"given":"Gang","family":"Li","sequence":"additional","affiliation":[{"name":"Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, China"},{"name":"Yunnan Key Laboratory of Artificial Intelligence, Kunming University of Science and Technology, Kunming 650500, China"}]},{"given":"Zhengtao","family":"Yu","sequence":"additional","affiliation":[{"name":"Faculty of Information Engineering and Automation, Kunming University of Science and Technology, Kunming 650500, China"},{"name":"Yunnan Key Laboratory of Artificial Intelligence, Kunming University of Science and Technology, Kunming 650500, 
China"}]}],"member":"1968","published-online":{"date-parts":[[2023,2,6]]},"reference":[{"key":"ref_1","unstructured":"Veli\u010dkovi\u0107, P., Cucurull, G., Casanova, A., Romero, A., Lio, P., and Bengio, Y. (2017). Graph attention networks. arXiv."},{"key":"ref_2","doi-asserted-by":"crossref","first-page":"297","DOI":"10.1162\/089120105774321091","article-title":"Sentence fusion for multidocument news summarization","volume":"31","author":"Barzilay","year":"2005","journal-title":"Comput. Linguist."},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Filippova, K., and Strube, M. (2008, October 25\u201327). Sentence fusion via dependency graph compression. Proceedings of the 2008 Conference on Empirical Methods in Natural Language Processing, Honolulu, HI, USA.","DOI":"10.3115\/1613715.1613741"},{"key":"ref_4","unstructured":"Banerjee, S., Mitra, P., and Sugiyama, K. (2015, July 25\u201331). Multi-document abstractive summarization using ILP based multi-sentence compression. Proceedings of the Twenty-Fourth International Joint Conference on Artificial Intelligence, Buenos Aires, Argentina."},{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"Li, W. (2015, September 17\u201321). Abstractive multi-document summarization with semantic information extraction. Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, Lisbon, Portugal.","DOI":"10.18653\/v1\/D15-1219"},{"key":"ref_6","doi-asserted-by":"crossref","unstructured":"Bing, L., Li, P., Liao, Y., Lam, W., Guo, W., and Passonneau, R.J. (2015). Abstractive multi-document summarization via phrase selection and merging. arXiv.","DOI":"10.3115\/v1\/P15-1153"},{"key":"ref_7","doi-asserted-by":"crossref","first-page":"637","DOI":"10.1613\/jair.2655","article-title":"Sentence compression as tree transduction","volume":"34","author":"Cohn","year":"2009","journal-title":"J. Artif. Intell. Res."},{"key":"ref_8","unstructured":"Wang, L., and Cardie, C. 
(2013, August 4\u20139). Domain-independent abstract generation for focused meeting summarization. Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Sofia, Bulgaria."},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Pighin, D., Cornolti, M., Alfonseca, E., and Filippova, K. (2014, June 22\u201327). Modelling events through memory-based, open-ie patterns for abstractive summarization. Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Baltimore, MD, USA.","DOI":"10.3115\/v1\/P14-1084"},{"key":"ref_10","unstructured":"Paulus, R., Xiong, C., and Socher, R. (2017). A deep reinforced model for abstractive summarization. arXiv."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Gehrmann, S., Deng, Y., and Rush, A.M. (2018). Bottom-up abstractive summarization. arXiv.","DOI":"10.18653\/v1\/D18-1443"},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Li, W., Xiao, X., Lyu, Y., and Wang, Y. (2018, October 31\u2013November 4). Improving neural abstractive document summarization with structural regularization. Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Brussels, Belgium.","DOI":"10.18653\/v1\/D18-1441"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Zhang, X., Wei, F., and Zhou, M. (2019). HIBERT: Document level pre-training of hierarchical bidirectional transformers for document summarization. arXiv.","DOI":"10.18653\/v1\/P19-1499"},{"key":"ref_14","first-page":"1","article-title":"Exploring the limits of transfer learning with a unified text-to-text transformer","volume":"21","author":"Raffel","year":"2020","journal-title":"J. Mach. Learn. Res."},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Lewis, M., Liu, Y., Goyal, N., Ghazvininejad, M., Mohamed, A., Levy, O., Stoyanov, V., and Zettlemoyer, L. (2019). 
BART: Denoising Sequence-to-Sequence Pre-training for Natural Language Generation, Translation, and Comprehension. arXiv.","DOI":"10.18653\/v1\/2020.acl-main.703"},{"key":"ref_16","unstructured":"Zhang, J., Zhao, Y., Saleh, M., and Liu, P. Pegasus: Pre-training with extracted gap-sentences for abstractive summarization. Proceedings of the International Conference on Machine Learning. PMLR, Available online: http:\/\/proceedings.mlr.press\/v119\/zhang20ae\/zhang20ae.pdf."},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Zou, Y., Zhang, X., Lu, W., Wei, F., and Zhou, M. (2020). Pre-training for abstractive document summarization by reinstating source text. arXiv.","DOI":"10.18653\/v1\/2020.emnlp-main.297"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Grail, Q., Perez, J., and Gaussier, E. (2021, April 19\u201323). Globalizing BERT-based transformer architectures for long document summarization. Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, Online.","DOI":"10.18653\/v1\/2021.eacl-main.154"},{"key":"ref_19","doi-asserted-by":"crossref","first-page":"457","DOI":"10.1613\/jair.1523","article-title":"Lexrank: Graph-based lexical centrality as salience in text summarization","volume":"22","author":"Erkan","year":"2004","journal-title":"J. Artif. Intell. Res."},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Wan, X. (2008, October 25\u201327). An exploration of document impact on graph-based multi-document summarization. Proceedings of the 2008 Conference on Empirical Methods in Natural Language Processing, Honolulu, HI, USA.","DOI":"10.3115\/1613715.1613811"},{"key":"ref_21","unstructured":"Christensen, J., Soderland, S., and Etzioni, O. (2013, June 9\u201314). 
Towards coherent multi-document summarization. Proceedings of the 2013 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Atlanta, GA, USA."},{"key":"ref_22","unstructured":"Tan, J., Wan, X., and Xiao, J. (2017, July 30\u2013August 4). Abstractive document summarization with a graph-based attentional neural model. Proceedings of the 55th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Vancouver, BC, Canada."},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Yasunaga, M., Zhang, R., Meelu, K., Pareek, A., Srinivasan, K., and Radev, D. (2017). Graph-based neural multi-document summarization. arXiv.","DOI":"10.18653\/v1\/K17-1045"},{"key":"ref_24","doi-asserted-by":"crossref","unstructured":"Fan, A., Gardent, C., Braud, C., and Bordes, A. (2019). Using local knowledge graph construction to scale seq2seq models to multi-document inputs. arXiv.","DOI":"10.18653\/v1\/D19-1428"},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Huang, L., Wu, L., and Wang, L. (2020). Knowledge graph-augmented abstractive summarization with semantic-driven cloze reward. arXiv.","DOI":"10.18653\/v1\/2020.acl-main.457"},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Wang, D., Liu, P., Zheng, Y., Qiu, X., and Huang, X. (2020). Heterogeneous graph neural networks for extractive document summarization. arXiv.","DOI":"10.18653\/v1\/2020.acl-main.553"},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Song, Z., and King, I. (2022, January 7\u201314). Hierarchical Heterogeneous Graph Attention Network for Syntax-Aware Summarization. Proceedings of the AAAI Conference on Artificial Intelligence, Washington, DC, USA.","DOI":"10.1609\/aaai.v36i10.21385"},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Jin, H., Wang, T., and Wan, X. (2020, July 5\u201310). 
Multi-granularity interaction network for extractive and abstractive multi-document summarization. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Online.","DOI":"10.18653\/v1\/2020.acl-main.556"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Liu, Y., and Lapata, M. (2019). Hierarchical transformers for multi-document summarization. arXiv.","DOI":"10.18653\/v1\/P19-1500"},{"key":"ref_30","first-page":"9","article-title":"Language models are unsupervised multitask learners","volume":"1","author":"Radford","year":"2019","journal-title":"OpenAI Blog"},{"key":"ref_31","unstructured":"Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A.N., Kaiser, \u0141., and Polosukhin, I. (2017). Attention is all you need. Adv. Neural Inf. Process. Syst., 30."},{"key":"ref_32","unstructured":"Lin, C.Y. ROUGE: A package for automatic evaluation of summaries. Proceedings of the Workshop on Text Summarization Branches Out, Available online: https:\/\/aclanthology.org\/W04-1013.pdf."},{"key":"ref_33","unstructured":"Liu, P.J., Saleh, M., Pot, E., Goodrich, B., Sepassi, R., Kaiser, L., and Shazeer, N. (2018). Generating wikipedia by summarizing long sequences. arXiv."},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Li, W., Xiao, X., Liu, J., Wu, H., Wang, H., and Du, J. (2020). Leveraging graph to improve abstractive multi-document summarization. 
arXiv.","DOI":"10.18653\/v1\/2020.acl-main.555"}],"container-title":["Information"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2078-2489\/14\/2\/97\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,10]],"date-time":"2025-10-10T18:25:16Z","timestamp":1760120716000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2078-2489\/14\/2\/97"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,2,6]]},"references-count":34,"journal-issue":{"issue":"2","published-online":{"date-parts":[[2023,2]]}},"alternative-id":["info14020097"],"URL":"https:\/\/doi.org\/10.3390\/info14020097","relation":{},"ISSN":["2078-2489"],"issn-type":[{"type":"electronic","value":"2078-2489"}],"subject":[],"published":{"date-parts":[[2023,2,6]]}}}