{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2024,10,18]],"date-time":"2024-10-18T04:29:15Z","timestamp":1729225755351,"version":"3.27.0"},"reference-count":0,"publisher":"IOS Press","isbn-type":[{"value":"9781643685489","type":"electronic"}],"license":[{"start":{"date-parts":[[2024,10,16]],"date-time":"2024-10-16T00:00:00Z","timestamp":1729036800000},"content-version":"unspecified","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by-nc\/4.0\/"}],"content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2024,10,16]]},"abstract":"<jats:p>Mini-batch Graph Transformer\u00a0(MGT), as an emerging graph learning model, has demonstrated significant advantages in semi-supervised node prediction tasks with improved computational efficiency and enhanced model robustness. However, existing methods for processing local information either rely on sampling or simple aggregation, which respectively result in the loss and squashing of critical neighbor information. Moreover, the limited number of nodes in each mini-batch restricts the model\u2019s capacity to capture the global characteristics of the graph. In this paper, we propose LGMformer, a novel MGT model that employs a two-stage augmented interaction strategy, transitioning from local to global perspectives, to address the aforementioned bottlenecks. The local interaction augmentation\u00a0(LIA) presents a neighbor-target interaction Transformer (NTIformer) to acquire an insightful understanding of the co-interaction patterns between neighbors and the target node, resulting in a locally effective token list that serves as input for the MGT. In contrast, the global interaction augmentation\u00a0(GIA) adopts a cross-attention mechanism to incorporate entire-graph prototypes into the target node representation, thereby compensating for global graph information and ensuring a more comprehensive perception. In this way, LGMformer enhances node representations under the MGT paradigm. Experimental results for node classification on ten benchmark datasets demonstrate the effectiveness of the proposed method. Our code is available at https:\/\/github.com\/l-wd\/LGMformer.<\/jats:p>","DOI":"10.3233\/faia240842","type":"book-chapter","created":{"date-parts":[[2024,10,17]],"date-time":"2024-10-17T13:31:54Z","timestamp":1729171914000},"source":"Crossref","is-referenced-by-count":0,"title":["Learning a Mini-Batch Graph Transformer via Two-Stage Interaction Augmentation"],"prefix":"10.3233","author":[{"given":"Wenda","family":"Li","sequence":"first","affiliation":[{"name":"State Key Laboratory of Blockchain and Security, Zhejiang University"},{"name":"School of Software Technology, Zhejiang University"},{"name":"Hangzhou High-Tech Zone (Binjiang) Institute of Blockchain and Data Security"}]},{"given":"Kaixuan","family":"Chen","sequence":"additional","affiliation":[{"name":"State Key Laboratory of Blockchain and Security, Zhejiang University"},{"name":"Hangzhou High-Tech Zone (Binjiang) Institute of Blockchain and Data Security"}]},{"given":"Shunyu","family":"Liu","sequence":"additional","affiliation":[{"name":"State Key Laboratory of Blockchain and Security, Zhejiang University"},{"name":"Hangzhou High-Tech Zone (Binjiang) Institute of Blockchain and Data Security"}]},{"given":"Tongya","family":"Zheng","sequence":"additional","affiliation":[{"name":"Big Graph Center, School of Computer and Computing Science, Hangzhou City University"},{"name":"College of Computer Science and Technology, Zhejiang University, Hangzhou, China"}]},{"given":"Wenjie","family":"Huang","sequence":"additional","affiliation":[{"name":"State Key Laboratory of Blockchain and Security, Zhejiang University"},{"name":"Hangzhou High-Tech Zone (Binjiang) Institute of Blockchain and Data Security"}]},{"given":"Mingli","family":"Song","sequence":"additional","affiliation":[{"name":"State Key Laboratory of Blockchain and Security, Zhejiang University"},{"name":"Hangzhou High-Tech Zone (Binjiang) Institute of Blockchain and Data Security"}]}],"member":"7437","container-title":["Frontiers in Artificial Intelligence and Applications","ECAI 2024"],"original-title":[],"link":[{"URL":"https:\/\/ebooks.iospress.nl\/pdf\/doi\/10.3233\/FAIA240842","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,10,17]],"date-time":"2024-10-17T13:31:54Z","timestamp":1729171914000},"score":1,"resource":{"primary":{"URL":"https:\/\/ebooks.iospress.nl\/doi\/10.3233\/FAIA240842"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,10,16]]},"ISBN":["9781643685489"],"references-count":0,"URL":"https:\/\/doi.org\/10.3233\/faia240842","relation":{},"ISSN":["0922-6389","1879-8314"],"issn-type":[{"value":"0922-6389","type":"print"},{"value":"1879-8314","type":"electronic"}],"subject":[],"published":{"date-parts":[[2024,10,16]]}}}