{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,11,1]],"date-time":"2025-11-01T09:36:42Z","timestamp":1761989802430,"version":"build-2065373602"},"reference-count":59,"publisher":"MDPI AG","issue":"4","license":[{"start":{"date-parts":[[2025,3,24]],"date-time":"2025-03-24T00:00:00Z","timestamp":1742774400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0\/"}],"funder":[{"name":"Key R&amp;D Plans of Yunnan Province","award":["202203AA080004","2020AAA0108004"],"award-info":[{"award-number":["202203AA080004","2020AAA0108004"]}]},{"name":"National Key R&amp;D Plan","award":["202203AA080004","2020AAA0108004"],"award-info":[{"award-number":["202203AA080004","2020AAA0108004"]}]}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["Symmetry"],"abstract":"<jats:p>Semantic Textual Similarity (STS) serves as a metric for evaluating the semantic symmetry between texts, playing a pivotal role in various natural language processing (NLP) tasks. To facilitate the accurate measurement of semantic symmetry, high-quality text representation is essential. This paper studies how to utilize constituent parsing for text representation in STS. Unlike most existing syntax models, we propose a heterogeneous graph attention network that integrates constituent parsing (HGAT-CP). The heterogeneous graph contains meaningfully connected sentences, verb phrase (VP), noun phrase (NP), phrase, and word nodes, which are derived from the constituent parsing tree. This graph is fed to a graph attention network for context propagation among relevant nodes, which effectively captures the relations of inter-sentence components. In addition, we leverage the relationships between verb phrases (VPs) and noun phrases (NPs) across sentence pairs for data augmentation, which is denoted as HGAT_CP(NP, VP). 
We extensively evaluate our method on three datasets, and experimental results demonstrate that our proposed HGAT_CP(NP, VP) achieves significant improvements on the majority of the datasets. Notably, on the SICK dataset, HGAT_CP(NP, VP) achieves improvements of 0.39 and 1.84 compared to SimCSE-RoBERTa_large and SimCSE-RoBERTa_base, respectively.<\/jats:p>","DOI":"10.3390\/sym17040486","type":"journal-article","created":{"date-parts":[[2025,3,24]],"date-time":"2025-03-24T13:48:20Z","timestamp":1742824100000},"page":"486","update-policy":"https:\/\/doi.org\/10.3390\/mdpi_crossmark_policy","source":"Crossref","is-referenced-by-count":2,"title":["Semantic Textual Similarity with Constituent Parsing Heterogeneous Graph Attention Networks"],"prefix":"10.3390","volume":"17","author":[{"ORCID":"https:\/\/orcid.org\/0009-0009-2999-417X","authenticated-orcid":false,"given":"Hao","family":"Wu","sequence":"first","affiliation":[{"name":"School of Computer Science and Technology, Dalian University of Technology, Dalian 116000, China"}]},{"given":"Degen","family":"Huang","sequence":"additional","affiliation":[{"name":"School of Computer Science and Technology, Dalian University of Technology, Dalian 116000, China"}]},{"given":"Xiaohui","family":"Lin","sequence":"additional","affiliation":[{"name":"School of Computer Science and Technology, Dalian University of Technology, Dalian 116000, China"}]}],"member":"1968","published-online":{"date-parts":[[2025,3,24]]},"reference":[{"key":"ref_1","unstructured":"Wang, M., Smith, N.A., and Mitamura, T. (2007, January 28\u201330). What is The Jeopardy Model? A Quasi-Synchronous Grammar for QA. Proceedings of the 2007 Joint Conference on Empirical Methods in Natural Language Processing and Computational Natural Language Learning, Prague, Czech Republic."},{"key":"ref_2","doi-asserted-by":"crossref","unstructured":"Yang, Y., Yih, W.T., and Meek, C. (2015, January 17\u201321). WikiQA: A Challenge Dataset for Open-Domain Question Answering. 
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, Lisbon, Portugal.","DOI":"10.18653\/v1\/D15-1237"},{"key":"ref_3","doi-asserted-by":"crossref","unstructured":"Santos, J., Alves, A., and Gon\u00e7alo Oliveira, H. (2020, January 2\u20134). Leveraging on Semantic Textual Similarity for Developing a Portuguese Dialogue System. Proceedings of the International Conference on Computational Processing of the Portuguese Language, Evora, Portugal.","DOI":"10.1007\/978-3-030-41505-1_13"},{"key":"ref_4","doi-asserted-by":"crossref","unstructured":"Yin, W., and Sch\u00fctze, H. (June, January 31). Convolutional Neural Network for Paraphrase Identification. Proceedings of the Human Language Technologies: The 2015 Annual Conference of the North American Chapter of the ACL, Denver, CO, USA.","DOI":"10.3115\/v1\/N15-1091"},{"key":"ref_5","doi-asserted-by":"crossref","unstructured":"He, H., Wieting, J., Gimpel, K., Rao, J., and Lin, J. (2016, January 16\u201317). UMD-TTIC-UW at SemEval-2016 Task 1: Attention-Based Multi-Perspective Convolutional Neural Networks for Textual Similarity Measurement. Proceedings of the SemEval-2016, San Diego, CA, USA.","DOI":"10.18653\/v1\/S16-1170"},{"key":"ref_6","unstructured":"Richardson, R., and Smeaton, A.F. (1995). Using Wordnet in a Knowledge-Based Approach to Information Retrieval, Dublin City University, School of Computer Applications."},{"key":"ref_7","unstructured":"Niwattanakul, S., Singthongchai, J., Naenudorn, E., and Wanapu, S. (2013, January 13\u201315). Using of Jaccard Coefficient for Keywords Similarity. Proceedings of the International Multiconference of Engineers and Computer Scientists, Hong Kong, China."},{"key":"ref_8","doi-asserted-by":"crossref","first-page":"1425","DOI":"10.1162\/tacl_a_00435","article-title":"Weisfeiler-Leman in the BAMBOO: Novel AMR Graph Metrics and a Benchmark for AMR Graph Similarity","volume":"9","author":"Opitz","year":"2021","journal-title":"Trans. Assoc. 
Comput. Linguist."},{"key":"ref_9","doi-asserted-by":"crossref","unstructured":"Wang, H., and Yu, D. (2023, January 9\u201314). Going Beyond Sentence Embeddings: A Token-Level Matching Algorithm for Calculating Semantic Textual Similarity. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics, Toronto, ON, Canada.","DOI":"10.18653\/v1\/2023.acl-short.49"},{"key":"ref_10","doi-asserted-by":"crossref","unstructured":"Pagliardini, M., Gupta, P., and Jaggi, M. (2018, January 1\u20136). Unsupervised Learning of Sentence Embeddings using Compositional N-gram Features. Proceedings of the NAACL-HLT 2018, New Orleans, LA, USA.","DOI":"10.18653\/v1\/N18-1049"},{"key":"ref_11","unstructured":"Le, Q., and Mikolov, T. (2014, January 21\u201326). Distributed Representations of Sentences and Documents. Proceedings of the 31st International Conference on International Conference on Machine Learning, Volume 32 (ICML\u201914), Beijing, China."},{"key":"ref_12","doi-asserted-by":"crossref","unstructured":"He, H., Gimpel, K., and Lin, J. (2015, January 17\u201321). Multi-Perspective Sentence Similarity Modeling with Convolutional Neural Network. Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, Lisbon, Portugal.","DOI":"10.18653\/v1\/D15-1181"},{"key":"ref_13","doi-asserted-by":"crossref","unstructured":"Mueller, J., and Thyagarajan, A. (2016, January 12\u201317). Siamese Recurrent Architectures for Learning Sentence Similarity. Proceedings of the Thirtieth AAAI Conference on Artificial Intelligence (AAAI\u201916), Phoenix, AZ, USA.","DOI":"10.1609\/aaai.v30i1.10350"},{"key":"ref_14","unstructured":"Ranasinghe, T., Or\u01cesan, C., and Mitkov, R. (2019, January 2\u20134). Semantic Textual Similarity with Siamese Neural Networks. 
Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019), Varna, Bulgaria."},{"key":"ref_15","first-page":"22","article-title":"WordNet: A Lexical Database for English","volume":"38","author":"Miller","year":"1992","journal-title":"Commun. ACM"},{"key":"ref_16","unstructured":"Devlin, J., Chang, M.W., Lee, K., and Toutanova, K. (2019, January 2\u20137). BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding. Proceedings of the NAACL-HLT 2019, Minneapolis, MN, USA."},{"key":"ref_17","first-page":"5998","article-title":"Attention Is All You Need","volume":"30","author":"Vaswani","year":"2017","journal-title":"Adv. Neural Inf. Process. Syst."},{"key":"ref_18","unstructured":"Liu, Y., Ott, M., Goyal, N., Du, J., Joshi, M., Chen, D., Levy, O., Lewis, M., Zettlemoyer, L., and Stoyanov, V. (2019). RoBERTa: A Robustly Optimized BERT Pretraining Approach. arXiv."},{"key":"ref_19","doi-asserted-by":"crossref","unstructured":"Reimers, N., and Gurevych, I. (2019, January 3\u20137). Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks. Proceedings of the EMNLP 2019, Hong Kong, China.","DOI":"10.18653\/v1\/D19-1410"},{"key":"ref_20","doi-asserted-by":"crossref","unstructured":"Gao, T., Yao, X., and Chen, D. (2021, January 7\u201311). SimCSE: Simple Contrastive Learning of Sentence Embeddings. Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, Punta Cana, Dominican Republic.","DOI":"10.18653\/v1\/2021.emnlp-main.552"},{"key":"ref_21","doi-asserted-by":"crossref","unstructured":"Chuang, Y.S., Dangovski, R., Luo, H., Zhang, Y., Chang, S., Solja\u010di\u0107, M., Li, S.W., Yih, W.T., Kim, Y., and Glass, J. (2022, January 10\u201315). DiffCSE: Difference-Based Contrastive Learning for Sentence Embeddings. 
Proceedings of the NAACL 2022: Human Language Technologies, Seattle, WA, USA.","DOI":"10.18653\/v1\/2022.naacl-main.311"},{"key":"ref_22","doi-asserted-by":"crossref","unstructured":"Zhang, D., Xiao, W., Zhu, H., Ma, X., and Arnold, A.O. (2022, January 22\u201327). Virtual Augmentation Supported Contrastive Learning of Sentence Representations. Proceedings of the Findings of the Association for Computational Linguistics: ACL 2022, Dublin, Ireland.","DOI":"10.18653\/v1\/2022.findings-acl.70"},{"key":"ref_23","unstructured":"Nguyen, X.P., Joty, S., Hoi, S.C., and Socher, R. (2019). Tree-Structured Attention with Hierarchical Accumulation. arXiv."},{"key":"ref_24","first-page":"9636","article-title":"SG-Net: Syntax-Guided Machine Reading Comprehension","volume":"34","author":"Zhang","year":"2020","journal-title":"AAAI Conf. Artif. Intell."},{"key":"ref_25","doi-asserted-by":"crossref","unstructured":"Socher, R., Perelygin, A., Wu, J., Chuang, J., Manning, C.D., Ng, A.Y., and Potts, C. (2013, January 18\u201321). Recursive Deep Models for Semantic Compositionality over a Sentiment Treebank. Proceedings of the 2013 Conference on Empirical Methods in Natural Language Processing, Seattle, WA, USA.","DOI":"10.18653\/v1\/D13-1170"},{"key":"ref_26","doi-asserted-by":"crossref","unstructured":"Tai, K.S., Socher, R., and Manning, C.D. (2015, January 26\u201331). Improved Semantic Representations from Tree-Structured Long Short-Term Memory Networks. Proceedings of the 53rd Annual Meeting of the Association for Computational Linguistics, Beijing, China.","DOI":"10.3115\/v1\/P15-1150"},{"key":"ref_27","doi-asserted-by":"crossref","unstructured":"Li, Z., Zhou, Q., Li, C., Xu, K., and Cao, Y. (2021, January 1\u20136). Improving BERT with Syntax-aware Local Attention. 
Proceedings of the Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, Online.","DOI":"10.18653\/v1\/2021.findings-acl.57"},{"key":"ref_28","doi-asserted-by":"crossref","unstructured":"Xu, Z., Guo, D., Tang, D., Su, Q., Shou, L., Gong, M., Zhong, W., Quan, X., Duan, N., and Jiang, D. (2021, January 1\u20136). Syntax-Enhanced Pre-trained Model. Proceedings of the ACL 2021, Online.","DOI":"10.18653\/v1\/2021.acl-long.420"},{"key":"ref_29","doi-asserted-by":"crossref","unstructured":"Bai, J., Wang, Y., Chen, Y., Yang, Y., Bai, J., Yu, J., and Tong, Y. (2021). Syntax-BERT: Improving Pre-trained Transformers with Syntax Trees. arXiv.","DOI":"10.18653\/v1\/2021.eacl-main.262"},{"key":"ref_30","doi-asserted-by":"crossref","unstructured":"Wang, R., Tang, D., Duan, N., Wei, Z., Huang, X., Cao, G., Jiang, D., and Zhou, M. (2021, January 1\u20136). K-adapter: Infusing Knowledge into Pre-trained Models with Adapters. Proceedings of the Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021, Online.","DOI":"10.18653\/v1\/2021.findings-acl.121"},{"key":"ref_31","doi-asserted-by":"crossref","unstructured":"Liang, S., Wei, W., Mao, X.L., Wang, F., and He, Z. (2022, January 22\u201327). BiSyn-GAT+: Bi-Syntax Aware Graph Attention Network for Aspect-based Sentiment Analysis. Proceedings of the Findings of the Association for Computational Linguistics: ACL 2022, Dublin, Ireland.","DOI":"10.18653\/v1\/2022.findings-acl.144"},{"key":"ref_32","first-page":"12462","article-title":"Gate: Graph Attention Transformer Encoder for Crosslingual Relation and Event Extraction","volume":"35","author":"Ahmad","year":"2021","journal-title":"Proc. AAAI Conf. Artif. Intell."},{"key":"ref_33","doi-asserted-by":"crossref","unstructured":"Devianti, R., and Miyao, Y. (2024, January 12\u201316). Transferability of Syntax-Aware Graph Neural Networks in Zero-Shot Cross-Lingual Semantic Role Labeling. 
Proceedings of the Findings of the Association for Computational Linguistics: EMNLP 2024, Miami, FL, USA.","DOI":"10.18653\/v1\/2024.findings-emnlp.2"},{"key":"ref_34","doi-asserted-by":"crossref","unstructured":"Zhang, P., Chen, J., Shen, J., Zhai, Z., Li, P., Zhang, J., and Zhang, K. (2024, January 12\u201316). Message Passing on Semantic-Anchor-Graphs for Fine-grained Emotion Representation Learning and Classification. Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, Miami, FL, USA.","DOI":"10.18653\/v1\/2024.emnlp-main.162"},{"key":"ref_35","doi-asserted-by":"crossref","unstructured":"Xu, H., Bao, J., and Liu, W. (2023, January 9\u201314). Double-Branch Multi-Attention based Graph Neural Network for Knowledge Graph Completion. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Toronto, ON, Canada.","DOI":"10.18653\/v1\/2023.acl-long.850"},{"key":"ref_36","doi-asserted-by":"crossref","unstructured":"Zhang, D., Chen, F., and Chen, X. (2023, January 9\u201314). DualGATs: Dual Graph Attention Networks for Emotion Recognition in Conversations. Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), Toronto, ON, Canada.","DOI":"10.18653\/v1\/2023.acl-long.408"},{"key":"ref_37","unstructured":"Wang, X., Ji, H., Shi, C., Wang, B., Ye, Y., Cui, P., and Yu, P.S. (2019, May 13\u201317). Heterogeneous Graph Attention Network. Proceedings of the World Wide Web Conference, San Francisco, CA, USA."},{"key":"ref_38","doi-asserted-by":"crossref","first-page":"793","DOI":"10.1007\/s12559-023-10110-1","article-title":"Dialogue Relation Extraction with Document-Level Heterogeneous Graph Attention Networks","volume":"15","author":"Chen","year":"2023","journal-title":"Cogn. 
Comput."},{"key":"ref_39","first-page":"4821","article-title":"HGAT: Heterogeneous Graph Attention Networks for Semi-Supervised Short Text Classification","volume":"39","author":"Linmei","year":"2021","journal-title":"ACM Trans. Inf. Syst. (TOIS)"},{"key":"ref_40","doi-asserted-by":"crossref","first-page":"184","DOI":"10.5715\/jnlp.30.184","article-title":"Joint Learning-based Heterogeneous Graph Attention Network for Timeline Summarization","volume":"30","author":"You","year":"2023","journal-title":"J. Nat. Lang. Process."},{"key":"ref_41","doi-asserted-by":"crossref","unstructured":"Chen, S., Feng, S., Liang, S., Zong, C.C., Li, J., and Li, P. (2024, January 11\u201316). CACL: Community-Aware Heterogeneous Graph Contrastive Learning for Social Media Bot Detection. Proceedings of the Findings of the Association for Computational Linguistics: ACL 2024, Bangkok, Thailand.","DOI":"10.18653\/v1\/2024.findings-acl.617"},{"key":"ref_42","first-page":"1","article-title":"A Multisource Data Fusion-based Heterogeneous Graph Attention Network for Competitor Prediction","volume":"18","author":"Ye","year":"2024","journal-title":"ACM Trans. Knowl. Discov. Data"},{"key":"ref_43","doi-asserted-by":"crossref","unstructured":"Yang, Y., Tong, Y., Ma, S., and Deng, Z.H. (2016, January 1\u20135). A Position Encoding Convolutional Neural Network based on Dependency Tree for Relation Classification. Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing, Austin, TX, USA.","DOI":"10.18653\/v1\/D16-1007"},{"key":"ref_44","doi-asserted-by":"crossref","unstructured":"Xu, Y., Mou, L., Li, G., Chen, Y., Peng, H., and Jin, Z. (2015, January 17\u201321). Classifying Relations via Long Short Term Memory Networks Along Shortest Dependency Paths. 
Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing, Lisbon, Portugal.","DOI":"10.18653\/v1\/D15-1206"},{"key":"ref_45","doi-asserted-by":"crossref","unstructured":"Jiang, X., Li, Z., Zhang, B., Zhang, M., Li, S., and Si, L. (2018, January 15\u201320). Supervised Treebank Conversion: Data and Approaches. Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Long Papers), Melbourne, Australia.","DOI":"10.18653\/v1\/P18-1252"},{"key":"ref_46","unstructured":"Shen, Y., Lin, Z., Huang, C.W., and Courville, A. (May, January 30). Neural Language Modeling by Jointly Learning Syntax and Lexicon. Proceedings of the International Conference on Learning Representations, Vancouver, BC, Canada."},{"key":"ref_47","unstructured":"Shen, Y., Tan, S., Sordoni, A., and Courville, A. (2019, January 6\u20139). Ordered Neurons: Integrating Tree Structures into Recurrent Neural Networks. Proceedings of the International Conference on Learning Representations, New Orleans, LA, USA."},{"key":"ref_48","doi-asserted-by":"crossref","unstructured":"Sachan, D.S., Zhang, Y., Qi, P., and Hamilton, W. (2021, January 19\u201323). Do Syntax Trees Help Pre-trained Transformers Extract Information? Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics, Online.","DOI":"10.18653\/v1\/2021.eacl-main.228"},{"key":"ref_49","doi-asserted-by":"crossref","unstructured":"Bao, X., Jiang, X., Wang, Z., Zhang, Y., and Zhou, G. (2023, January 9\u201314). Opinion Tree Parsing for Aspect-based Sentiment Analysis. 
Proceedings of the Findings of the Association for Computational Linguistics: ACL 2023, Toronto, ON, Canada.","DOI":"10.18653\/v1\/2023.findings-acl.505"},{"key":"ref_50","doi-asserted-by":"crossref","first-page":"baac070","DOI":"10.1093\/database\/baac070","article-title":"Do Syntactic Trees Enhance Bidirectional Encoder Representations from Transformers (BERT) models for chemical\u2013drug relation extraction?","volume":"2022","author":"Tang","year":"2022","journal-title":"Database"},{"key":"ref_51","unstructured":"Velickovic, P., Cucurull, G., Casanova, A., Romero, A., Lio, P., and Bengio, Y. (2017). Graph Attention Networks. arXiv."},{"key":"ref_52","doi-asserted-by":"crossref","first-page":"79","DOI":"10.1214\/aoms\/1177729694","article-title":"On Information and Sufficiency","volume":"22","author":"Kullback","year":"1951","journal-title":"Ann. Math. Stat."},{"key":"ref_53","doi-asserted-by":"crossref","unstructured":"Marelli, M., Bentivogli, L., Baroni, M., Bernardi, R., Menini, S., and Zamparelli, R. (2014, January 23\u201324). SemEval-2014 task 1: Evaluation of Compositional Distributional Semantic Models on Full Sentences Through Semantic Relatedness and Textual Entailment. Proceedings of the 8th International Workshop on Semantic Evaluation, Dublin, Ireland.","DOI":"10.3115\/v1\/S14-2001"},{"key":"ref_54","unstructured":"Agirre, E., Cer, D., Diab, M., and Gonzalez-Agirre, A. (2012, January 7\u20138). SemEval-2012 task 6: A Pilot on Semantic Textual Similarity. Proceedings of the First Joint Conference on Lexical and Computational Semantics, Montreal, QC, Canada."},{"key":"ref_55","doi-asserted-by":"crossref","first-page":"207","DOI":"10.1162\/tacl_a_00177","article-title":"Grounded Compositional Semantics for Finding and Describing Images with Sentences","volume":"2","author":"Socher","year":"2014","journal-title":"Trans. Assoc. Comput. Linguist."},{"key":"ref_56","doi-asserted-by":"crossref","unstructured":"Shao, Y. (2017, January 3\u20134). 
HCTI at SemEval-2017 Task 1: Use Convolutional Neural Network to Evaluate Semantic Textual Similarity. Proceedings of the 11th International Workshop on Semantic Evaluation (SemEval-2017), Vancouver, BC, Canada.","DOI":"10.18653\/v1\/S17-2016"},{"key":"ref_57","doi-asserted-by":"crossref","unstructured":"Yang, Y., Yuan, S., Cer, D., Kong, S.Y., Constant, N., Pilar, P., Ge, H., Sung, Y.H., Strope, B., and Kurzweil, R. (2018, January 20). Learning Semantic Textual Similarity from Conversations. Proceedings of the 3rd Workshop on Representation Learning for NLP, Melbourne, Australia.","DOI":"10.18653\/v1\/W18-3022"},{"key":"ref_58","doi-asserted-by":"crossref","first-page":"433","DOI":"10.1017\/S1351324919000184","article-title":"Jointly Learning Sentence Embeddings and Syntax with Unsupervised Tree-LSTMs","volume":"25","author":"Maillard","year":"2019","journal-title":"Nat. Lang. Eng."},{"key":"ref_59","doi-asserted-by":"crossref","unstructured":"Choi, J., Yoo, K.M., and Lee, S.G. (2018, January 2\u20137). Learning to Compose Task-Specific Tree Structures. 
Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence: Thirtieth Innovative Applications of Artificial Intelligence Conference and Eighth Symposium on Educational Advances in Artificial Intelligence, New Orleans, LA, USA.","DOI":"10.1609\/aaai.v32i1.11975"}],"container-title":["Symmetry"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/www.mdpi.com\/2073-8994\/17\/4\/486\/pdf","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,10,9]],"date-time":"2025-10-09T16:59:22Z","timestamp":1760029162000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.mdpi.com\/2073-8994\/17\/4\/486"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2025,3,24]]},"references-count":59,"journal-issue":{"issue":"4","published-online":{"date-parts":[[2025,4]]}},"alternative-id":["sym17040486"],"URL":"https:\/\/doi.org\/10.3390\/sym17040486","relation":{},"ISSN":["2073-8994"],"issn-type":[{"type":"electronic","value":"2073-8994"}],"subject":[],"published":{"date-parts":[[2025,3,24]]}}}