{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,23]],"date-time":"2026-01-23T10:37:07Z","timestamp":1769164627939,"version":"3.49.0"},"reference-count":81,"publisher":"Association for Computing Machinery (ACM)","issue":"5","license":[{"start":{"date-parts":[[2023,7,21]],"date-time":"2023-07-21T00:00:00Z","timestamp":1689897600000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/www.acm.org\/publications\/policies\/copyright_policy#Background"}],"funder":[{"name":"China NSF","award":["62072200, 62202146"],"award-info":[{"award-number":["62072200, 62202146"]}]}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":["ACM Trans. Softw. Eng. Methodol."],"published-print":{"date-parts":[[2023,9,30]]},"abstract":"<jats:p>Intelligent deep learning-based models have made significant progress for automated source code semantics embedding, and current research works mainly leverage natural language-based methods and graph-based methods. However, natural language-based methods do not capture the rich semantic structural information of source code, and graph-based methods do not utilize rich distant information of source code due to the high cost of message-passing steps.<\/jats:p>\n          <jats:p>In this article, we propose a novel interpretable model, called graph tensor convolution neural network (GTCN), to generate accurate code embedding, which is capable of comprehensively capturing the distant information of code sequences and rich code semantics structural information. First, we propose to utilize a high-dimensional tensor to integrate various heterogeneous code graphs with node sequence features, such as control flow, data flow. 
Second, inspired by the current advantages of graph-based deep learning and efficient tensor computations, we propose a novel interpretable graph tensor convolution neural network for learning accurate code semantic embedding from the code graph tensor. Finally, we evaluate three popular applications on the GTCN model: variable misuse detection, source code prediction, and vulnerability detection. Compared with current state-of-the-art methods, our model achieves higher scores with respect to the top-1 accuracy while costing less training time.<\/jats:p>","DOI":"10.1145\/3582574","type":"journal-article","created":{"date-parts":[[2023,2,20]],"date-time":"2023-02-20T11:48:26Z","timestamp":1676893706000},"page":"1-40","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":4,"title":["Toward Interpretable Graph Tensor Convolution Neural Network for Code Semantics Embedding"],"prefix":"10.1145","volume":"32","author":[{"ORCID":"https:\/\/orcid.org\/0000-0003-1469-0789","authenticated-orcid":false,"given":"Jia","family":"Yang","sequence":"first","affiliation":[{"name":"Hubei Key Laboratory of Distributed System Security, Hubei Engineering Research Center on Big Data Security, School of Cyber Science and Engineering, Huazhong University of Science and Technology, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-4536-3537","authenticated-orcid":false,"given":"Cai","family":"Fu","sequence":"additional","affiliation":[{"name":"Hubei Key Laboratory of Distributed System Security, Hubei Engineering Research Center on Big Data Security, School of Cyber Science and Engineering, Huazhong University of Science and Technology, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0003-3352-4065","authenticated-orcid":false,"given":"Fengyang","family":"Deng","sequence":"additional","affiliation":[{"name":"Huazhong University of Science and Technology, 
China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-5588-9618","authenticated-orcid":false,"given":"Ming","family":"Wen","sequence":"additional","affiliation":[{"name":"Huazhong University of Science and Technology, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-7239-7471","authenticated-orcid":false,"given":"Xiaowei","family":"Guo","sequence":"additional","affiliation":[{"name":"Huazhong University of Science and Technology"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-5428-290X","authenticated-orcid":false,"given":"Chuanhao","family":"Wan","sequence":"additional","affiliation":[{"name":"Huazhong University of Science and Technology"}]}],"member":"320","published-online":{"date-parts":[[2023,7,21]]},"reference":[{"key":"e_1_3_2_2_2","unstructured":"Yaqin Zhou. 2019. Source codes of the paper: Devign: Effective vulnerability identification by learning comprehensive program semantics via graph neural networks. https:\/\/github.com\/epicosy\/devign."},{"key":"e_1_3_2_3_2","doi-asserted-by":"crossref","unstructured":"Yu Wang. 2020. Source codes of the paper: Learning semantic program embeddings with graph interval neural network. https:\/\/github.com\/GINN-Imp\/GINN.","DOI":"10.1145\/3428205"},{"key":"e_1_3_2_4_2","unstructured":"Vincent J. Hellendoorn. 2020. Source codes of the paper: Global relational models of source code. https:\/\/github.com\/VHellendoorn\/ICLR20-Great."},{"key":"e_1_3_2_5_2","doi-asserted-by":"crossref","unstructured":"Zhangyin Feng. 2021. Source codes of the paper: CodeBERT: A pre-trained model for programming and natural languages. https:\/\/github.com\/microsoft\/CodeBERT.","DOI":"10.18653\/v1\/2020.findings-emnlp.139"},{"key":"e_1_3_2_6_2","unstructured":"Jia Yang. 2022. Source codes of this paper. https:\/\/gitee.com\/cse-sss\/GTCN."},{"key":"e_1_3_2_7_2","unstructured":"Jia Yang. 2022. Source codes of this paper. https:\/\/github.com\/SmileResearch\/GTCN."},{"key":"e_1_3_2_8_2","doi-asserted-by":"crossref","unstructured":"Yi Li. 2021. 
Source codes of the paper: Vulnerability detection with fine-grained interpretations. https:\/\/github.com\/vulnerabilitydetection\/VulnerabilityDetectionResearch.","DOI":"10.1145\/3468264.3468597"},{"key":"e_1_3_2_9_2","volume-title":"6th International Conference on Learning Representations","author":"Allamanis Miltiadis","year":"2018","unstructured":"Miltiadis Allamanis, Marc Brockschmidt, and Mahmoud Khademi. 2018. Learning to represent programs with graphs. In 6th International Conference on Learning Representations."},{"key":"e_1_3_2_10_2","article-title":"Sequence model design for code completion in the modern IDE","volume":"2004","author":"Aye Gareth Ari","year":"2020","unstructured":"Gareth Ari Aye and Gail E. Kaiser. 2020. Sequence model design for code completion in the modern IDE. CoRR abs\/2004.05249 (2020).","journal-title":"CoRR"},{"key":"e_1_3_2_11_2","doi-asserted-by":"publisher","DOI":"10.1126\/science.1197448"},{"key":"e_1_3_2_12_2","doi-asserted-by":"publisher","DOI":"10.1145\/2976749.2978422"},{"key":"e_1_3_2_13_2","first-page":"2933","volume-title":"33rd International Conference on Machine Learning","author":"Bielik Pavol","year":"2016","unstructured":"Pavol Bielik, Veselin Raychev, and Martin T. Vechev. 2016. PHOG: Probabilistic model for code. In 33rd International Conference on Machine Learning. 2933\u20132942."},{"key":"e_1_3_2_14_2","first-page":"1204","volume-title":"38th International Conference on Machine Learning","author":"Cai Tianle","year":"2021","unstructured":"Tianle Cai, Shengjie Luo, and Keyulu Xu. 2021. GraphNorm: A principled approach to accelerating graph neural network training. In 38th International Conference on Machine Learning. 
1204\u20131215."},{"key":"e_1_3_2_15_2","doi-asserted-by":"publisher","DOI":"10.1145\/3510003.3510219"},{"key":"e_1_3_2_16_2","doi-asserted-by":"publisher","DOI":"10.1109\/JSTSP.2021.3058019"},{"key":"e_1_3_2_17_2","doi-asserted-by":"publisher","DOI":"10.1145\/3436877"},{"key":"e_1_3_2_18_2","doi-asserted-by":"publisher","DOI":"10.1109\/TNN.1997.641482"},{"key":"e_1_3_2_19_2","doi-asserted-by":"publisher","DOI":"10.1109\/MSP.2013.2297439"},{"key":"e_1_3_2_20_2","first-page":"1536","volume-title":"Findings of the Association for Computational Linguistics: EMNLP 2020, Online Event, 16\u201320 November 2020 (Findings of ACL)","author":"Feng Zhangyin","year":"2020","unstructured":"Zhangyin Feng, Daya Guo, Duyu Tang et al. 2020. CodeBERT: A pre-trained model for programming and natural languages. In Findings of the Association for Computational Linguistics: EMNLP 2020, Online Event, 16\u201320 November 2020 (Findings of ACL), Vol. EMNLP 2020. Association for Computational Linguistics, 1536\u20131547."},{"key":"e_1_3_2_21_2","volume-title":"7th International Conference on Learning Representations","author":"Fernandes Patrick","year":"2019","unstructured":"Patrick Fernandes, Miltiadis Allamanis, and Marc Brockschmidt. 2019. Structured neural summarization. In 7th International Conference on Learning Representations. OpenReview.net."},{"key":"e_1_3_2_22_2","unstructured":"Gauthier Gidel, Tony Jebara, and Simon Lacoste-Julien. 2017. Frank-Wolfe algorithms for saddle point problems. In 20th International Conference on Artificial Intelligence and Statistics (AISTATS). 362\u2013371."},{"key":"e_1_3_2_23_2","first-page":"1","volume-title":"39th ACM Symposium on Principles of Database Systems","author":"Grohe Martin","year":"2020","unstructured":"Martin Grohe. 2020. word2vec, node2vec, graph2vec, X2vec: Towards a theory of vector embeddings of structured data. In 39th ACM Symposium on Principles of Database Systems. 
1\u201316."},{"key":"e_1_3_2_24_2","volume-title":"9th International Conference on Learning Representations","author":"Guo Daya","year":"2021","unstructured":"Daya Guo, Shuo Ren, and Shuai Lu. 2021. GraphCodeBERT: Pre-training code representations with data flow. In 9th International Conference on Learning Representations. OpenReview.net."},{"key":"e_1_3_2_25_2","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v31i1.10886"},{"key":"e_1_3_2_26_2","first-page":"67","volume-title":"Joint Proceedings of SEED & NLPaSE co-located with 27th Asia Pacific Software Engineering Conference","author":"Hanayama Kaisei","year":"2020","unstructured":"Kaisei Hanayama, Shinsuke Matsumoto, and Shinji Kusumoto. 2020. Humpback: Code completion system for Dockerfile based on language models (short paper). In Joint Proceedings of SEED & NLPaSE co-located with 27th Asia Pacific Software Engineering Conference. 67\u201373."},{"issue":"1","key":"e_1_3_2_27_2","first-page":"1","article-title":"The problem of overfitting","volume":"44","author":"Hawkins Douglas M.","year":"2004","unstructured":"Douglas M. Hawkins. 2004. The problem of overfitting. J. Chem. Inf. Model. 44, 1 (2004), 1\u201312.","journal-title":"J. Chem. Inf. Model."},{"key":"e_1_3_2_28_2","volume-title":"8th International Conference on Learning Representations","author":"Hellendoorn Vincent J.","year":"2020","unstructured":"Vincent J. Hellendoorn, Charles Sutton, Rishabh Singh, Petros Maniatis, and David Bieber. 2020. Global relational models of source code. In 8th International Conference on Learning Representations. 
OpenReview.net."},{"key":"e_1_3_2_29_2","doi-asserted-by":"publisher","DOI":"10.5555\/2337223.2337322"},{"key":"e_1_3_2_30_2","doi-asserted-by":"publisher","DOI":"10.1162\/neco_a_01296"},{"key":"e_1_3_2_31_2","doi-asserted-by":"publisher","DOI":"10.21437\/Interspeech.2020-2909"},{"key":"e_1_3_2_32_2","doi-asserted-by":"publisher","DOI":"10.24963\/ijcai.2020\/175"},{"key":"e_1_3_2_33_2","first-page":"427","volume-title":"30th International Conference on Machine Learning","author":"Jaggi Martin","year":"2013","unstructured":"Martin Jaggi. 2013. Revisiting Frank-Wolfe: Projection-free sparse convex optimization. In 30th International Conference on Machine Learning. 427\u2013435."},{"key":"e_1_3_2_34_2","article-title":"Differentially private matrix completion, revisited","volume":"1712","author":"Jain Prateek","year":"2017","unstructured":"Prateek Jain, Om Thakkar, and Abhradeep Thakurta. 2017. Differentially private matrix completion, revisited. CoRR abs\/1712.09765 (2017).","journal-title":"CoRR"},{"key":"e_1_3_2_35_2","doi-asserted-by":"publisher","DOI":"10.1007\/s11063-018-9793-9"},{"key":"e_1_3_2_36_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.patrec.2013.05.018"},{"key":"e_1_3_2_37_2","doi-asserted-by":"publisher","DOI":"10.1109\/TIP.2019.2946098"},{"key":"e_1_3_2_38_2","doi-asserted-by":"publisher","DOI":"10.1145\/3377811.3380342"},{"key":"e_1_3_2_39_2","doi-asserted-by":"publisher","DOI":"10.1145\/3424144"},{"key":"e_1_3_2_40_2","doi-asserted-by":"publisher","DOI":"10.1007\/s12652-020-01832-3"},{"key":"e_1_3_2_41_2","article-title":"Lower and upper bounds on the VC-dimension of tensor network models","volume":"2106","author":"Khavari Behnoush","year":"2021","unstructured":"Behnoush Khavari and Guillaume Rabusseau. 2021. Lower and upper bounds on the VC-dimension of tensor network models. 
CoRR abs\/2106.11827 (2021).","journal-title":"CoRR"},{"key":"e_1_3_2_42_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.laa.2010.09.020"},{"key":"e_1_3_2_43_2","doi-asserted-by":"publisher","DOI":"10.1109\/ICSE43902.2021.00026"},{"key":"e_1_3_2_44_2","volume-title":"5th International Conference on Learning Representations","author":"Kipf Thomas N.","year":"2017","unstructured":"Thomas N. Kipf and Max Welling. 2017. Semi-supervised classification with graph convolutional networks. In 5th International Conference on Learning Representations. OpenReview.net."},{"key":"e_1_3_2_45_2","doi-asserted-by":"publisher","DOI":"10.1109\/TGRS.2002.1006358"},{"key":"e_1_3_2_46_2","doi-asserted-by":"publisher","DOI":"10.5555\/3304222.3304348"},{"key":"e_1_3_2_47_2","doi-asserted-by":"publisher","DOI":"10.1109\/DSC50466.2020.00066"},{"key":"e_1_3_2_48_2","volume-title":"4th International Conference on Learning Representations","author":"Li Yujia","year":"2016","unstructured":"Yujia Li, Daniel Tarlow, Marc Brockschmidt, and Richard S. Zemel. 2016. Gated graph sequence neural networks. In 4th International Conference on Learning Representations."},{"key":"e_1_3_2_49_2","doi-asserted-by":"publisher","DOI":"10.1145\/3468264.3468597"},{"key":"e_1_3_2_50_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.future.2020.10.018"},{"issue":"1","key":"e_1_3_2_51_2","first-page":"6","article-title":"A survey of feature extraction and classifier design based on tensor pattern","volume":"39","author":"Li-mei Zhang","year":"2009","unstructured":"Zhang Li-mei, Qiao Li-shan, and Chen Song-can. 2009. A survey of feature extraction and classifier design based on tensor pattern. J. Shandong Univ. (Eng. Sci.) 39, 1 (2009), 6\u201314.","journal-title":"J. Shandong Univ. (Eng. 
Sci.)"},{"key":"e_1_3_2_52_2","doi-asserted-by":"publisher","DOI":"10.1145\/3387904.3389261"},{"key":"e_1_3_2_53_2","doi-asserted-by":"publisher","DOI":"10.1145\/3324884.3416591"},{"key":"e_1_3_2_54_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.patcog.2010.05.009"},{"key":"e_1_3_2_55_2","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v34i05.6359"},{"key":"e_1_3_2_56_2","article-title":"Tracking translation invariance in CNNs","volume":"2104","author":"Myburgh Johannes C.","year":"2021","unstructured":"Johannes C. Myburgh, Coenraad Mouton, and Marelie H. Davel. 2021. Tracking translation invariance in CNNs. CoRR abs\/2104.05997 (2021).","journal-title":"CoRR"},{"key":"e_1_3_2_57_2","doi-asserted-by":"publisher","DOI":"10.1109\/ICSE.2015.336"},{"key":"e_1_3_2_58_2","doi-asserted-by":"publisher","DOI":"10.5555\/2337223.2337232"},{"key":"e_1_3_2_59_2","doi-asserted-by":"publisher","DOI":"10.1145\/3377811.3380926"},{"key":"e_1_3_2_60_2","article-title":"Overcoming overfitting and large weight update problem in linear rectifiers: Thresholded exponential rectified linear units","volume":"2006","author":"Pandey Vijay","year":"2020","unstructured":"Vijay Pandey. 2020. Overcoming overfitting and large weight update problem in linear rectifiers: Thresholded exponential rectified linear units. CoRR abs\/2006.02797 (2020).","journal-title":"CoRR"},{"key":"e_1_3_2_61_2","doi-asserted-by":"publisher","DOI":"10.1145\/1858996.1859089"},{"key":"e_1_3_2_62_2","article-title":"secureTF: A secure TensorFlow framework","volume":"2101","author":"Quoc Do Le","year":"2021","unstructured":"Do Le Quoc, Franz Gregor, Sergei Arnautov, Roland Kunkel, Pramod Bhatotia, and Christof Fetzer. 2021. secureTF: A secure TensorFlow framework. CoRR abs\/2101.08204 (2021).","journal-title":"CoRR"},{"key":"e_1_3_2_63_2","doi-asserted-by":"crossref","unstructured":"Md. Mostafizer Rahman, Yutaka Watanobe, and Keita Nakamura. 2020. 
A neural network based intelligent support model for program code completion. Sci. Program. 2020, 7426461 (2020), 1\u201318.","DOI":"10.1155\/2020\/7426461"},{"key":"e_1_3_2_64_2","doi-asserted-by":"publisher","DOI":"10.1049\/iet-cvi.2017.0469"},{"key":"e_1_3_2_65_2","doi-asserted-by":"publisher","DOI":"10.1145\/3397537.3398483"},{"key":"e_1_3_2_66_2","doi-asserted-by":"publisher","DOI":"10.1109\/ICMLA.2018.00120"},{"key":"e_1_3_2_67_2","doi-asserted-by":"publisher","DOI":"10.1007\/978-3-319-93417-4_38"},{"key":"e_1_3_2_68_2","volume-title":"54th Annual Meeting of the Association for Computational Linguistics, ACL","author":"Sennrich Rico","year":"2016","unstructured":"Rico Sennrich, Barry Haddow, and Alexandra Birch. 2016. Neural machine translation of rare words with subword units. In 54th Annual Meeting of the Association for Computational Linguistics, ACL."},{"key":"e_1_3_2_69_2","doi-asserted-by":"publisher","DOI":"10.1109\/LCOMM.2020.3025298"},{"key":"e_1_3_2_70_2","first-page":"2727","volume-title":"25th ACM International Conference on Knowledge Discovery & Data Mining","author":"Svyatkovskiy Alexey","year":"2019","unstructured":"Alexey Svyatkovskiy, Ying Zhao, Shengyu Fu, and Neel Sundaresan. 2019. In 25th ACM International Conference on Knowledge Discovery & Data Mining. 2727\u20132735."},{"key":"e_1_3_2_71_2","first-page":"448","volume-title":"International Joint Conference on Neural Networks","author":"Tjandra Andros","year":"2016","unstructured":"Andros Tjandra, Sakriani Sakti, and Ruli Manurung. 2016. Gated recurrent neural tensor network. In International Joint Conference on Neural Networks. 448\u2013455."},{"key":"e_1_3_2_72_2","unstructured":"S. Waner. 1986. Introduction to differential geometry and general relativity. Lecture Notes by Stefan Waner with a Special Guest Lecture by Gregory C. 
Levine, Department of Mathematics, Hofstra University. https:\/\/medusa.teodesian.net\/docs\/mathematics\/Intro%20to%20Differential%20Geometry%20and%20General%20Relativity%20-%20S.%20Warner%20(2002)%20WW.pdf."},{"key":"e_1_3_2_73_2","doi-asserted-by":"publisher","DOI":"10.1109\/TIFS.2020.3044773"},{"key":"e_1_3_2_74_2","unstructured":"Wenhan Wang, Kechi Zhang, Ge Li, and Zhi Jin. 2020. Learning to Represent Programs with Heterogeneous Graphs. (2020). arxiv:cs.SE\/2012.04188"},{"key":"e_1_3_2_75_2","doi-asserted-by":"publisher","DOI":"10.1145\/3428205"},{"key":"e_1_3_2_76_2","volume-title":"10th International Conference on Learning Representations","author":"Wu Yingxin","year":"2022","unstructured":"Yingxin Wu, Xiang Wang, An Zhang, Xiangnan He, and Tat-Seng Chua. 2022. Discovering invariant rationales for graph neural networks. In 10th International Conference on Learning Representations. OpenReview.net."},{"key":"e_1_3_2_77_2","doi-asserted-by":"publisher","DOI":"10.1109\/ICDM51629.2021.00185"},{"key":"e_1_3_2_78_2","first-page":"590","volume-title":"IEEE Symposium on Security and Privacy","author":"Yamaguchi Fabian","year":"2014","unstructured":"Fabian Yamaguchi, Nico Golde, Daniel Arp, and Konrad Rieck. 2014. Modeling and discovering vulnerabilities with code property graphs. In IEEE Symposium on Security and Privacy. 590\u2013604."},{"key":"e_1_3_2_79_2","first-page":"11960","volume-title":"Annual Conference on Neural Information Processing Systems","author":"Yun Seongjun","year":"2019","unstructured":"Seongjun Yun, Minbyul Jeong, Raehyun Kim et al. 2019. Graph transformer networks. In Annual Conference on Neural Information Processing Systems. 
11960\u201311970."},{"key":"e_1_3_2_80_2","doi-asserted-by":"publisher","DOI":"10.1016\/j.patcog.2009.01.010"},{"key":"e_1_3_2_81_2","doi-asserted-by":"publisher","DOI":"10.1109\/TETCI.2021.3100641"},{"key":"e_1_3_2_82_2","first-page":"10197","volume-title":"Annual Conference on Neural Information Processing Systems","author":"Zhou Yaqin","year":"2019","unstructured":"Yaqin Zhou, Shangqing Liu, Jing Kai Siow et al. 2019. Devign: Effective vulnerability identification by learning comprehensive program semantics via graph neural networks. In Annual Conference on Neural Information Processing Systems. 10197\u201310207."}],"container-title":["ACM Transactions on Software Engineering and Methodology"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3582574","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3582574","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,17]],"date-time":"2025-06-17T18:09:13Z","timestamp":1750183753000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3582574"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,7,21]]},"references-count":81,"journal-issue":{"issue":"5","published-print":{"date-parts":[[2023,9,30]]}},"alternative-id":["10.1145\/3582574"],"URL":"https:\/\/doi.org\/10.1145\/3582574","relation":{},"ISSN":["1049-331X","1557-7392"],"issn-type":[{"value":"1049-331X","type":"print"},{"value":"1557-7392","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,7,21]]},"assertion":[{"value":"2022-07-18","order":0,"name":"received","label":"Received","group":{"name":"publication_history","label":"Publication History"}},{"value":"2023-01-04","order":1,"name":"accepted","label":"Accepted","group":{"name":"publication_history","label":"Publication 
History"}},{"value":"2023-07-21","order":2,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}