{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,5,1]],"date-time":"2026-05-01T04:47:24Z","timestamp":1777610844150,"version":"3.51.4"},"reference-count":35,"publisher":"MDPI AG","issue":"8","license":[{"start":{"date-parts":[[2021,8,9]],"date-time":"2021-08-09T00:00:00Z","timestamp":1628467200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Symmetry"],"abstract":"<jats:p>Relation extraction aims to identify the relationship between two named entities in a sentence. Because a sentence usually contains several named entities, capturing the structural information of a sentence is important to support this task. Currently, graph neural networks are widely applied to relation extraction, in which dependency trees are employed to generate adjacency matrices that encode the structural information of a sentence. However, parsing a sentence is error-prone, which degrades the performance of a graph neural network. On the other hand, a sentence is structured by its named entities, which precisely segment it into several parts. Different features can be combined according to prior knowledge and experience, which is effective for initializing a symmetric adjacency matrix for a graph neural network. Based on this observation, we propose a feature combination-based graph convolutional neural network model (FC-GCN). It has the advantages of encoding the structural information of a sentence, considering prior knowledge, and avoiding errors caused by parsing. 
In the experiments, the results show significant improvements, outperforming existing state-of-the-art performance.<\/jats:p>","DOI":"10.3390\/sym13081458","type":"journal-article","created":{"date-parts":[[2021,8,9]],"date-time":"2021-08-09T21:41:46Z","timestamp":1628545306000},"page":"1458","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":15,"title":["A Feature Combination-Based Graph Convolutional Neural Network Model for Relation Extraction"],"prefix":"10.3390","volume":"13","author":[{"given":"Jinling","family":"Xu","sequence":"first","affiliation":[{"name":"College of Computer Science and Technology, Guizhou University, Guiyang 550025, China"}]},{"given":"Yanping","family":"Chen","sequence":"additional","affiliation":[{"name":"College of Computer Science and Technology, Guizhou University, Guiyang 550025, China"}]},{"given":"Yongbin","family":"Qin","sequence":"additional","affiliation":[{"name":"College of Computer Science and Technology, Guizhou University, Guiyang 550025, China"}]},{"given":"Ruizhang","family":"Huang","sequence":"additional","affiliation":[{"name":"College of Computer Science and Technology, Guizhou University, Guiyang 550025, China"}]},{"given":"Qinghua","family":"Zheng","sequence":"additional","affiliation":[{"name":"School of Automation Science and Engineering, Xi\u2019an Jiaotong University, Xi\u2019an 710049, China"}]}],"member":"1968","published-online":{"date-parts":[[2021,8,9]]},"reference":[{"key":"ref_1","doi-asserted-by":"crossref","unstructured":"Zhang, Y., Zhong, V., Chen, D., Angeli, G., and Manning, C.D. (2017, January 7\u201311). Position-aware attention and supervised data improve slot filling. Proceedings of the 2017 Conference on Empirical Methods in Natural Language Processing, Copenhagen, Denmark.","DOI":"10.18653\/v1\/D17-1004"},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"He, R., Wang, J., Guo, F., and Han, Y. (2020, January 5\u201310). 
Transs-driven joint learning architecture for implicit discourse relation recognition. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Online.","DOI":"10.18653\/v1\/2020.acl-main.14"},{"key":"ref_3","first-page":"8928","article-title":"Relation extraction with convolutional network over learnable syntax-transport graph","volume":"34","author":"Sun","year":"2020","journal-title":"Proc. AAAI Conf. Artif."},{"key":"ref_4","doi-asserted-by":"crossref","first-page":"142515","DOI":"10.1109\/ACCESS.2019.2944559","article-title":"A set space model to capture structural information of a sentence","volume":"7","author":"Chen","year":"2019","journal-title":"IEEE Access"},{"key":"ref_5","unstructured":"Zeng, D., Liu, K., Lai, S., Zhou, G., and Zhao, J. (2014). Relation classification via convolutional deep neural network. Proceedings of the COLING 2014, the 25th International Conference on Computational Linguistics: Technical Papers, Dublin City University and Association for Computational Linguistics. Available online: https:\/\/www.aclweb.org\/anthology\/C14-1220."},{"key":"ref_6","first-page":"941","article-title":"Semantic relation classification via convolutional neural networks with simple negative sampling","volume":"71","author":"Xu","year":"2015","journal-title":"Comput. Sci."},{"key":"ref_7","doi-asserted-by":"crossref","unstructured":"Zeng, D., Liu, K., Chen, Y., and Zhao, J. (2015, January 17\u201321). Distant supervision for relation extraction via piecewise convolutional neural networks. Proceedings of the Conference on Empirical Methods in Natural Language Processing, Lisbon, Portugal.","DOI":"10.18653\/v1\/D15-1203"},{"key":"ref_8","unstructured":"Yan, X., Mou, L., Li, G., Chen, Y., and Jin, Z. (2015, January 17\u201321). Classifying relations via long short term memory networks along shortest dependency paths. 
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing (EMNLP), Lisbon, Portugal."},{"key":"ref_9","first-page":"9620","article-title":"Distilling knowledge from well-informed soft labels for neural relation extraction","volume":"34","author":"Zhang","year":"2020","journal-title":"Proc. AAAI Conf. Artif."},{"key":"ref_10","unstructured":"Veyseh, A.P.B., Dernoncourt, F., Dou, D., and Nguyen, T.H. (2020, January 5\u201310). Exploiting the syntax-model consistency for neural relation extraction. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Online."},{"key":"ref_11","doi-asserted-by":"crossref","unstructured":"Alt, C., Gabryszak, A., and Hennig, L. (2020). Probing linguistic features of sentence-level representations in neural relation extraction. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Online, 5\u201310 July 2020, Association for Computational Linguistics. Available online: https:\/\/www.aclweb.org\/anthology\/2020.acl-main.140.","DOI":"10.18653\/v1\/2020.acl-main.140"},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"Yu, D., Sun, K., Cardie, C., and Yu, D. (2020, January 5\u201310). Dialogue-based relation extraction. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Online.","DOI":"10.18653\/v1\/2020.acl-main.444"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Zhou, W., Huang, K., Ma, T., and Huang, J. (2020). Document-level relation extraction with adaptive thresholding and localized context pooling. arXiv.","DOI":"10.1609\/aaai.v35i16.17717"},{"key":"ref_14","doi-asserted-by":"crossref","unstructured":"Jain, S., van Zuylen, M., Hajishirzi, H., and Beltagy, I. (2020). Scirex: A challenge dataset for document-level information extraction. 
arXiv.","DOI":"10.18653\/v1\/2020.acl-main.670"},{"key":"ref_15","doi-asserted-by":"crossref","unstructured":"Wei, Z., Su, J., Wang, Y., Tian, Y., and Chang, Y. (2020, January 5\u201310). A novel cascade binary tagging framework for relational triple extraction. Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Online.","DOI":"10.18653\/v1\/2020.acl-main.136"},{"key":"ref_16","unstructured":"Shen, Y., and Huang, X.-J. (2016, January 11\u201316). Attention-based convolutional neural network for semantic relation extraction. Proceedings of the COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers, Osaka, Japan."},{"key":"ref_17","doi-asserted-by":"crossref","unstructured":"Guo, Z., Zhang, Y., and Lu, W. (2019). Attention guided graph convolutional networks for relation extraction. arXiv.","DOI":"10.18653\/v1\/P19-1024"},{"key":"ref_18","doi-asserted-by":"crossref","unstructured":"Vashishth, S., Joshi, R., Prayaga, S.S., Bhattacharyya, C., and Talukdar, P. (November, January 31). Reside: Improving distantly-supervised neural relation extraction using side information. Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, Brussels, Belgium.","DOI":"10.18653\/v1\/D18-1157"},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Fu, T.J., and Ma, W.Y. (2019). Graphrel: Modeling text as relational graphs for joint entity and relation extraction. ACL, 1409\u20131418.","DOI":"10.18653\/v1\/P19-1136"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Zhang, Y., Qi, P., and Manning, C.D. (2018). Graph convolution over pruned dependency trees improves relation extraction. arXiv.","DOI":"10.18653\/v1\/D18-1244"},{"key":"ref_21","unstructured":"Vashishth, S., Sanyal, S., Nitin, V., and Talukdar, P.P. (2019). Composition-based multi-relational graph convolutional networks. 
arXiv."},{"key":"ref_22","unstructured":"Sun, C., Gong, Y., Wu, Y., Gong, M., and Duan, N. (August, January 28). Joint type inference on entities and relations via graph convolutional networks. Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy."},{"key":"ref_23","doi-asserted-by":"crossref","unstructured":"Chen, Y., Zheng, Q., and Zhang, W. (2014, January 22\u201327). Omni-word feature and soft constraint for chinese relation extraction. Proceedings of the 52nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Baltimore, MD, USA.","DOI":"10.3115\/v1\/P14-1054"},{"key":"ref_24","unstructured":"Kipf, T.N., and Welling, M. (2016). Semi-supervised classification with graph convolutional networks. arXiv."},{"key":"ref_25","unstructured":"Roth, D., and Yih, W.-T. (2004). A linear programming formulation for global inference in natural language tasks. Proceedings of the Eighth Conference on Computational Natural Language Learning (CoNLL-2004) at HLT-NAACL 2004, Boston, MA, USA, 6\u20137 May 2004, Association for Computational Linguistics. Available online: https:\/\/www.aclweb.org\/anthology\/W04-2401."},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Luan, Y., He, L., Ostendorf, M., and Hajishirzi, H. (2018). Multi-task identification of entities, relations, and coreferencefor scientific knowledge graph construction. arXiv.","DOI":"10.18653\/v1\/D18-1360"},{"key":"ref_27","unstructured":"Devlin, J., Chang, M.W., Lee, K., and Toutanova, K. (2018). Bert: Pre-training of deep bidirectional transformers for language understanding. arXiv."},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Kambhatla, N. (2004). Combining lexical, syntactic, and semantic features with maximum entropy models for extracting relations. 
Proceedings of the ACL 2004 on Interactive Poster and Demonstration Sessions, ACLdemo \u201904, Barcelona, Spain, 21\u201326 July 2004, Association for Computational Linguistics.","DOI":"10.3115\/1219044.1219066"},{"key":"ref_29","unstructured":"Zhou, G., Su, J., Zhang, J., and Zhang, M. (2005, January 25\u201330). Exploring various knowledge in relation extraction. Proceedings of the ACL 2005, 43rd Annual Meeting of the Association for Computational Linguistics, Ann Arbor, MI, USA."},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Gormley, M.R., Yu, M., and Dredze, M. (2015). Improved relation extraction with feature-rich compositional embedding models. arXiv.","DOI":"10.18653\/v1\/D15-1205"},{"key":"ref_31","unstructured":"Veyseh, A.P.B., Nguyen, T.H., and Dou, D. (2019). Improving cross-domain performance for relation extraction via dependency prediction and information flow control. arXiv."},{"key":"ref_32","doi-asserted-by":"crossref","unstructured":"Wang, H., Tan, M., Yu, M., Chang, S., Wang, D., Xu, K., Guo, X., and Potdar, S. (2019). Extracting multiple-relations in one-pass with pre-trained transformers. arXiv.","DOI":"10.18653\/v1\/P19-1132"},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Zhong, Z., and Chen, D. (2020). A frustratingly easy approach for joint entity and relation extraction. 
arXiv.","DOI":"10.18653\/v1\/2021.naacl-main.5"},{"key":"ref_34","doi-asserted-by":"crossref","first-page":"13195","DOI":"10.1109\/ACCESS.2020.2966303","article-title":"A multi-channel deep neural network for relation extraction","volume":"8","author":"Chen","year":"2020","journal-title":"IEEE Access"},{"key":"ref_35","doi-asserted-by":"crossref","first-page":"249","DOI":"10.1016\/j.neunet.2021.04.010","article-title":"A neuralized feature engineering method for entity relation extraction","volume":"141","author":"Chen","year":"2021","journal-title":"Neural Netw."}],"container-title":["Symmetry"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2073-8994\/13\/8\/1458\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,11]],"date-time":"2025-10-11T06:43:08Z","timestamp":1760164988000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2073-8994\/13\/8\/1458"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,8,9]]},"references-count":35,"journal-issue":{"issue":"8","published-online":{"date-parts":[[2021,8]]}},"alternative-id":["sym13081458"],"URL":"https:\/\/doi.org\/10.3390\/sym13081458","relation":{},"ISSN":["2073-8994"],"issn-type":[{"value":"2073-8994","type":"electronic"}],"subject":[],"published":{"date-parts":[[2021,8,9]]}}}