{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,6,18]],"date-time":"2025-06-18T04:10:51Z","timestamp":1750219851056,"version":"3.41.0"},"publisher-location":"New York, NY, USA","reference-count":47,"publisher":"ACM","license":[{"start":{"date-parts":[[2022,11,25]],"date-time":"2022-11-25T00:00:00Z","timestamp":1669334400000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/www.acm.org\/publications\/policies\/copyright_policy#Background"}],"content-domain":{"domain":["dl.acm.org"],"crossmark-restriction":true},"short-container-title":[],"published-print":{"date-parts":[[2022,11,25]]},"DOI":"10.1145\/3578339.3578341","type":"proceedings-article","created":{"date-parts":[[2023,3,13]],"date-time":"2023-03-13T22:06:39Z","timestamp":1678745199000},"page":"6-13","update-policy":"https:\/\/doi.org\/10.1145\/crossmark-policy","source":"Crossref","is-referenced-by-count":0,"title":["Attention focuses on node information: An improved Graph Transformer for traffic prediction"],"prefix":"10.1145","author":[{"ORCID":"https:\/\/orcid.org\/0000-0002-3631-6459","authenticated-orcid":false,"given":"Zhenhong","family":"Liu","sequence":"first","affiliation":[{"name":"Huaqiao University, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-3468-5242","authenticated-orcid":false,"given":"Xiaoguang","family":"Yu","sequence":"additional","affiliation":[{"name":"Huaqiao University, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0002-5108-0835","authenticated-orcid":false,"given":"Xiaodong","family":"Xie","sequence":"additional","affiliation":[{"name":"Huaqiao University, China"}]},{"ORCID":"https:\/\/orcid.org\/0000-0001-9392-6213","authenticated-orcid":false,"given":"Xia","family":"Chen","sequence":"additional","affiliation":[{"name":"Huaqiao University, China"}]}],"member":"320","published-online":{"date-parts":[[2023,3,13]]},"reference":[
{"key":"e_1_3_2_1_1_1","unstructured":"Mohammed\u00a0S Ahmed and Allen\u00a0R Cook. 1979. Analysis of freeway traffic time-series data by using Box-Jenkins techniques. Number 722."},
{"key":"e_1_3_2_1_2_1","doi-asserted-by":"publisher","DOI":"10.1061\/(ASCE)0733-947X(2003)129:6(664)"},
{"key":"e_1_3_2_1_3_1","doi-asserted-by":"publisher","DOI":"10.1145\/3209978.3210006"},
{"key":"e_1_3_2_1_4_1","doi-asserted-by":"publisher","DOI":"10.1080\/00031305.2017.1380080"},
{"key":"e_1_3_2_1_5_1","unstructured":"Shaojie Bai, J\u00a0Zico Kolter, and Vladlen Koltun. 2018. An empirical evaluation of generic convolutional and recurrent networks for sequence modeling. arXiv preprint arXiv:1803.01271 (2018)."},
{"key":"e_1_3_2_1_6_1","volume-title":"Attention is all you need. Advances in neural information processing systems 30","author":"Vaswani Ashish","year":"2017","unstructured":"Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit, Llion Jones, Aidan\u00a0N Gomez, \u0141ukasz Kaiser, and Illia Polosukhin. 2017. Attention is all you need. Advances in neural information processing systems 30 (2017)."},
{"key":"e_1_3_2_1_7_1","unstructured":"Joan Bruna, Wojciech Zaremba, Arthur Szlam, and Yann LeCun. 2013. Spectral networks and locally connected networks on graphs. arXiv preprint arXiv:1312.6203 (2013)."},
{"key":"e_1_3_2_1_8_1","unstructured":"Thomas\u00a0N Kipf and Max Welling. 2016. Semi-supervised classification with graph convolutional networks. arXiv preprint arXiv:1609.02907 (2016)."},
{"key":"e_1_3_2_1_9_1","volume-title":"Convolutional neural networks on graphs with fast localized spectral filtering. Advances in neural information processing systems 29","author":"Defferrard Micha\u00ebl","year":"2016","unstructured":"Micha\u00ebl Defferrard, Xavier Bresson, and Pierre Vandergheynst. 2016. Convolutional neural networks on graphs with fast localized spectral filtering. Advances in neural information processing systems 29 (2016)."},
{"key":"e_1_3_2_1_10_1","unstructured":"Bingbing Xu, Huawei Shen, Qi Cao, Keting Cen, and Xueqi Cheng. 2020. Graph convolutional networks using heat kernel for semi-supervised learning. arXiv preprint arXiv:2007.16002 (2020)."},
{"key":"e_1_3_2_1_11_1","unstructured":"Yaguang Li, Rose Yu, Cyrus Shahabi, and Yan Liu. 2017. Diffusion convolutional recurrent neural network: Data-driven traffic forecasting. arXiv preprint arXiv:1707.01926 (2017)."},
{"key":"e_1_3_2_1_12_1","first-page":"21618","article-title":"Rethinking graph transformers with spectral attention","volume":"34","author":"Kreuzer Devin","year":"2021","unstructured":"Devin Kreuzer, Dominique Beaini, Will Hamilton, Vincent L\u00e9tourneau, and Prudencio Tossou. 2021. Rethinking graph transformers with spectral attention. Advances in Neural Information Processing Systems 34 (2021), 21618\u201321629.","journal-title":"Advances in Neural Information Processing Systems"},
{"key":"e_1_3_2_1_13_1","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v33i01.33011004"},
{"key":"e_1_3_2_1_14_1","unstructured":"Huaxiu Yao, Xianfeng Tang, Hua Wei, Guanjie Zheng, Yanwei Yu, and Zhenhui Li. 2018. Modeling spatial-temporal dynamics for traffic prediction. arXiv preprint arXiv:1803.01254 (2018), 922\u2013929."},
{"key":"e_1_3_2_1_15_1","unstructured":"Minhao Liu, Ailing Zeng, Zhijian Xu, Qiuxia Lai, and Qiang Xu. 2021. Time series is a special sequence: Forecasting with sample convolution and interaction. arXiv preprint arXiv:2106.09305 (2021)."},
{"key":"e_1_3_2_1_16_1","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v34i01.5438"},
{"key":"e_1_3_2_1_17_1","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v35i12.17325"},
{"key":"e_1_3_2_1_18_1","unstructured":"Tian Zhou, Ziqing Ma, Qingsong Wen, Xue Wang, Liang Sun, and Rong Jin. 2022. FEDformer: Frequency enhanced decomposed transformer for long-term series forecasting. arXiv preprint arXiv:2201.12740 (2022)."},
{"key":"e_1_3_2_1_19_1","first-page":"22419","article-title":"Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting","volume":"34","author":"Wu Haixu","year":"2021","unstructured":"Haixu Wu, Jiehui Xu, Jianmin Wang, and Mingsheng Long. 2021. Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting. Advances in Neural Information Processing Systems 34 (2021), 22419\u201322430.","journal-title":"Advances in Neural Information Processing Systems"},
{"key":"e_1_3_2_1_20_1","volume-title":"International Conference on Learning Representations.","author":"Liu Shizhan","year":"2021","unstructured":"Shizhan Liu, Hang Yu, Cong Liao, Jianguo Li, Weiyao Lin, Alex\u00a0X Liu, and Schahram Dustdar. 2021. Pyraformer: Low-complexity pyramidal attention for long-range time series modeling and forecasting. In International Conference on Learning Representations."},
{"key":"e_1_3_2_1_21_1","unstructured":"Ailing Zeng, Muxi Chen, Lei Zhang, and Qiang Xu. 2022. Are Transformers Effective for Time Series Forecasting? arXiv preprint arXiv:2205.13504 (2022)."},
{"key":"e_1_3_2_1_22_1","unstructured":"Bing Yu, Haoteng Yin, and Zhanxing Zhu. 2017. Spatio-temporal graph convolutional networks: A deep learning framework for traffic forecasting. arXiv preprint arXiv:1709.04875 (2017)."},
{"key":"e_1_3_2_1_23_1","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v33i01.3301922"},
{"key":"e_1_3_2_1_24_1","volume-title":"LSGCN: Long Short-Term Traffic Prediction with Graph Convolutional Networks.. In IJCAI. 2355\u20132361.","author":"Huang Rongzhou","year":"2020","unstructured":"Rongzhou Huang, Chuyin Huang, Yubao Liu, Genan Dai, and Weiyang Kong. 2020. LSGCN: Long Short-Term Traffic Prediction with Graph Convolutional Networks. In IJCAI. 2355\u20132361."},
{"key":"e_1_3_2_1_25_1","unstructured":"Zonghan Wu, Shirui Pan, Guodong Long, Jing Jiang, and Chengqi Zhang. 2019. Graph wavenet for deep spatial-temporal graph modeling. arXiv preprint arXiv:1906.00121 (2019)."},
{"key":"e_1_3_2_1_26_1","volume-title":"Adaptive graph convolutional recurrent network for traffic forecasting. Advances in neural information processing systems 33","author":"Bai Lei","year":"2020","unstructured":"Lei Bai, Lina Yao, Can Li, Xianzhi Wang, and Can Wang. 2020. Adaptive graph convolutional recurrent network for traffic forecasting. Advances in neural information processing systems 33 (2020), 17804\u201317815."},
{"key":"e_1_3_2_1_27_1","volume-title":"Time-Series","author":"Anderson D.","year":"1976","unstructured":"Oliver\u00a0D. Anderson and M.\u00a0G. Kendall. 1976. Time-Series. 2nd edn. The Statistician 25 (1976), 308.","edition":"2"},
{"key":"e_1_3_2_1_28_1","first-page":"3","article-title":"STL: A seasonal-trend decomposition","volume":"6","author":"Cleveland B","year":"1990","unstructured":"Robert\u00a0B Cleveland, William\u00a0S Cleveland, Jean\u00a0E McRae, and Irma Terpenning. 1990. STL: A seasonal-trend decomposition. J. Off. Stat 6, 1 (1990), 3\u201373.","journal-title":"J. Off. Stat"},
{"volume-title":"Seasonal adjustment methods and real time trend-cycle estimation","author":"Dagum Estela\u00a0Bee","key":"e_1_3_2_1_29_1","unstructured":"Estela\u00a0Bee Dagum and Silvia Bianconcini. 2016. Seasonal adjustment methods and real time trend-cycle estimation. Springer."},
{"key":"e_1_3_2_1_30_1","doi-asserted-by":"publisher","DOI":"10.1609\/aaai.v35i5.16542"},
{"key":"e_1_3_2_1_31_1","unstructured":"Vijay\u00a0Prakash Dwivedi and Xavier Bresson. 2020. A generalization of transformer networks to graphs. arXiv preprint arXiv:2012.09699 (2020)."},
{"key":"e_1_3_2_1_32_1","volume-title":"Do transformers really perform badly for graph representation?Advances in Neural Information Processing Systems 34","author":"Ying Chengxuan","year":"2021","unstructured":"Chengxuan Ying, Tianle Cai, Shengjie Luo, Shuxin Zheng, Guolin Ke, Di He, Yanming Shen, and Tie-Yan Liu. 2021. Do transformers really perform badly for graph representation? Advances in Neural Information Processing Systems 34 (2021), 28877\u201328888."},
{"key":"e_1_3_2_1_33_1","unstructured":"Jinheon Baek, Minki Kang, and Sung\u00a0Ju Hwang. 2021. Accurate learning of graph representations with graph multiset pooling. arXiv preprint arXiv:2102.11533 (2021)."},
{"key":"e_1_3_2_1_34_1","volume-title":"Graph-bert: Only attention is needed for learning graph representations. arXiv preprint arXiv:2001.05140(2020).","author":"Zhang Jiawei","year":"2020","unstructured":"Jiawei Zhang, Haopeng Zhang, Congying Xia, and Li Sun. 2020. Graph-bert: Only attention is needed for learning graph representations. arXiv preprint arXiv:2001.05140 (2020)."},
{"key":"e_1_3_2_1_35_1","volume-title":"Graph transformer","author":"Li Yuan","year":"2019","unstructured":"Yuan Li, Xiaodan Liang, Zhiting Hu, Yinbo Chen, and Eric\u00a0P Xing. 2019. Graph transformer. URL https:\/\/openreview.net\/forum."},
{"key":"e_1_3_2_1_36_1","first-page":"12559","article-title":"Self-supervised graph transformer on large-scale molecular data","volume":"33","author":"Rong Yu","year":"2020","unstructured":"Yu Rong, Yatao Bian, Tingyang Xu, Weiyang Xie, Ying Wei, Wenbing Huang, and Junzhou Huang. 2020. Self-supervised graph transformer on large-scale molecular data. Advances in Neural Information Processing Systems 33 (2020), 12559\u201312571.","journal-title":"Advances in Neural Information Processing Systems"},
{"key":"e_1_3_2_1_37_1","volume-title":"The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains","author":"Shuman I","year":"2013","unstructured":"David\u00a0I Shuman, Sunil\u00a0K Narang, Pascal Frossard, Antonio Ortega, and Pierre Vandergheynst. 2013. The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains. IEEE signal processing magazine 30, 3 (2013), 83\u201398."},
{"key":"e_1_3_2_1_38_1","doi-asserted-by":"publisher","DOI":"10.1109\/ICCV48922.2021.00986"},
{"key":"e_1_3_2_1_39_1","unstructured":"Alexey Dosovitskiy, Lucas Beyer, Alexander Kolesnikov, Dirk Weissenborn, Xiaohua Zhai, Thomas Unterthiner, Mostafa Dehghani, Matthias Minderer, Georg Heigold, Sylvain Gelly. 2020. An image is worth 16x16 words: Transformers for image recognition at scale. arXiv preprint arXiv:2010.11929 (2020)."},
{"key":"e_1_3_2_1_40_1","volume-title":"International conference on learning representations.","author":"Hellendoorn J","year":"2019","unstructured":"Vincent\u00a0J Hellendoorn, Charles Sutton, Rishabh Singh, Petros Maniatis, and David Bieber. 2019. Global relational models of source code. In International conference on learning representations."},
{"key":"e_1_3_2_1_41_1","volume-title":"International Conference on Machine Learning. PMLR, 8476\u20138486","author":"Peng Dinglan","year":"2021","unstructured":"Dinglan Peng, Shuxin Zheng, Yatao Li, Guolin Ke, Di He, and Tie-Yan Liu. 2021. How could Neural Networks understand Programs? In International Conference on Machine Learning. PMLR, 8476\u20138486."},
{"key":"e_1_3_2_1_42_1","unstructured":"Daniel Z\u00fcgner, Tobias Kirschstein, Michele Catasta, Jure Leskovec, and Stephan G\u00fcnnemann. 2021. Language-agnostic representation learning of source code from structure and context. arXiv preprint arXiv:2103.11318 (2021)."},
{"key":"e_1_3_2_1_43_1","volume-title":"Long short-term memory. Neural computation 9, 8","author":"Hochreiter Sepp","year":"1997","unstructured":"Sepp Hochreiter and J\u00fcrgen Schmidhuber. 1997. Long short-term memory. Neural computation 9, 8 (1997), 1735\u20131780."},
{"key":"e_1_3_2_1_44_1","unstructured":"Junyoung Chung, Caglar Gulcehre, KyungHyun Cho, and Yoshua Bengio. 2014. Empirical evaluation of gated recurrent neural networks on sequence modeling. arXiv preprint arXiv:1412.3555 (2014)."},
{"volume-title":"Modeling financial time series with S-PLUS. Vol.\u00a02","author":"Zivot Eric","key":"e_1_3_2_1_45_1","unstructured":"Eric Zivot and Jiahui Wang. 2006. Modeling financial time series with S-PLUS. Vol.\u00a02. Springer."},
{"key":"e_1_3_2_1_46_1","doi-asserted-by":"publisher","DOI":"10.3141\/1748-12"},
{"key":"e_1_3_2_1_47_1","doi-asserted-by":"publisher","DOI":"10.1016\/j.ijforecast.2006.03.001"}
],"event":{"name":"BDSIC 2022: 2022 4th International Conference on Big-data Service and Intelligent Computation","acronym":"BDSIC 2022","location":"Xiamen China"},"container-title":["Proceedings of the 2022 4th International Conference on Big-data Service and Intelligent Computation"],"original-title":[],"link":[{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3578339.3578341","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/dl.acm.org\/doi\/pdf\/10.1145\/3578339.3578341","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2025,6,17]],"date-time":"2025-06-17T16:47:06Z","timestamp":1750178826000},"score":1,"resource":{"primary":{"URL":"https:\/\/dl.acm.org\/doi\/10.1145\/3578339.3578341"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,11,25]]},"references-count":47,"alternative-id":["10.1145\/3578339.3578341","10.1145\/3578339"],"URL":"https:\/\/doi.org\/10.1145\/3578339.3578341","relation":{},"subject":[],"published":{"date-parts":[[2022,11,25]]},"assertion":[{"value":"2023-03-13","order":2,"name":"published","label":"Published","group":{"name":"publication_history","label":"Publication History"}}]}}