{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,1,25]],"date-time":"2026-01-25T18:44:08Z","timestamp":1769366648040,"version":"3.49.0"},"reference-count":40,"publisher":"Springer Science and Business Media LLC","issue":"2","license":[{"start":{"date-parts":[[2023,3,22]],"date-time":"2023-03-22T00:00:00Z","timestamp":1679443200000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2023,3,22]],"date-time":"2023-03-22T00:00:00Z","timestamp":1679443200000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"name":"National Key Research and Development Program of China","award":["2018YFB0704301-1"],"award-info":[{"award-number":["2018YFB0704301-1"]}]},{"DOI":"10.13039\/501100001809","name":"National Natural Science Foundation of China","doi-asserted-by":"publisher","award":["61972268"],"award-info":[{"award-number":["61972268"]}],"id":[{"id":"10.13039\/501100001809","id-type":"DOI","asserted-by":"publisher"}]},{"name":"Med-X Center for Informatics Funding Project","award":["YGJC001"],"award-info":[{"award-number":["YGJC001"]}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Data Sci. Eng."],"published-print":{"date-parts":[[2023,6]]},"abstract":"<jats:title>Abstract<\/jats:title><jats:p>Temporal heterogeneous graphs can model lots of complex systems in the real world, such as social networks and e-commerce applications, which are naturally time-varying and heterogeneous. 
As most existing graph representation learning methods cannot efficiently handle both of these characteristics, we propose a Transformer-like representation learning model, named THAN, to learn low-dimensional node embeddings that simultaneously preserve the topological structure, heterogeneous semantics, and dynamic patterns of temporal heterogeneous graphs. Specifically, THAN first samples heterogeneous neighbors with temporal constraints and projects node features into the same vector space, then encodes time information and aggregates neighborhood influence with different weights via type-aware self-attention. To capture long-term dependencies and evolutionary patterns, we design an optional memory module for storing and evolving dynamic node representations. Experiments on three real-world datasets demonstrate that THAN outperforms state-of-the-art methods on the temporal link prediction task.<\/jats:p>","DOI":"10.1007\/s41019-023-00207-w","type":"journal-article","created":{"date-parts":[[2023,3,22]],"date-time":"2023-03-22T11:03:42Z","timestamp":1679483022000},"page":"98-111","update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":16,"title":["Memory-Enhanced Transformer for Representation Learning on Temporal Heterogeneous 
Graphs"],"prefix":"10.1007","volume":"8","author":[{"given":"Longhai","family":"Li","sequence":"first","affiliation":[]},{"given":"Lei","family":"Duan","sequence":"additional","affiliation":[]},{"given":"Junchen","family":"Wang","sequence":"additional","affiliation":[]},{"given":"Chengxin","family":"He","sequence":"additional","affiliation":[]},{"given":"Zihao","family":"Chen","sequence":"additional","affiliation":[]},{"given":"Guicai","family":"Xie","sequence":"additional","affiliation":[]},{"given":"Song","family":"Deng","sequence":"additional","affiliation":[]},{"given":"Zhaohang","family":"Luo","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2023,3,22]]},"reference":[{"key":"207_CR1","unstructured":"Kipf TN, Welling M (2016) Variational graph auto-encoders. CoRR arXiv:1611.07308"},{"key":"207_CR2","doi-asserted-by":"crossref","unstructured":"Schlichtkrull MS, Kipf TN, Bloem P, van den Berg R, Titov I, Welling M (2018) Modeling relational data with graph convolutional networks. In: Proceedings of the 15th international conference on semantic web, vol 10843, pp 593\u2013607","DOI":"10.1007\/978-3-319-93417-4_38"},{"key":"207_CR3","doi-asserted-by":"publisher","first-page":"45","DOI":"10.1016\/j.ymeth.2021.10.006","volume":"198","author":"C He","year":"2022","unstructured":"He C, Duan L, Zheng H, Li-Ling J, Song L, Li L (2022) Graph convolutional network approach to discovering disease-related CIRCRNA\u2013MIRNA\u2013MRNA axes. Methods 198:45\u201355","journal-title":"Methods"},{"key":"207_CR4","unstructured":"Kipf TN, Welling M (2017) Semi-supervised classification with graph convolutional networks. In: Proceedings of the 5th international conference on learning representations"},{"key":"207_CR5","unstructured":"Velickovic P, Cucurull G, Casanova A, Romero A, Li\u00f2 P, Bengio Y (2018) Graph attention networks. 
In: Proceedings of the 6th international conference on learning representations"},{"key":"207_CR6","unstructured":"Gilmer J, Schoenholz SS, Riley PF, Vinyals O, Dahl GE (2017) Neural message passing for quantum chemistry. In: Proceedings of the 34th international conference on machine learning, vol 70, pp 1263\u20131272"},{"key":"207_CR7","unstructured":"Ying Z, You J, Morris C, Ren X, Hamilton WL, Leskovec J (2018) Hierarchical graph representation learning with differentiable pooling. In: Proceedings of the 32nd international conference on neural information processing systems, pp 4805\u20134815"},{"key":"207_CR8","doi-asserted-by":"publisher","first-page":"57","DOI":"10.1007\/s41019-021-00174-0","volume":"7","author":"S Tuteja","year":"2022","unstructured":"Tuteja S, Kumar R (2022) A unification of heterogeneous data sources into a graph model in e-commerce. Data Sci Eng 7:57\u201370","journal-title":"Data Sci Eng"},{"key":"207_CR9","doi-asserted-by":"crossref","unstructured":"Dong Y, Chawla NV, Swami A (2017) metapath2vec: scalable representation learning for heterogeneous networks. In: Proceedings of the 23rd ACM SIGKDD international conference on knowledge discovery and data mining, pp 135\u2013144","DOI":"10.1145\/3097983.3098036"},{"key":"207_CR10","doi-asserted-by":"crossref","unstructured":"Fu T, Lee W, Lei Z (2017) Hin2vec: explore meta-paths in heterogeneous information networks for representation learning. In: Proceedings of the 2017 ACM on conference on information and knowledge management, pp 1797\u20131806","DOI":"10.1145\/3132847.3132953"},{"key":"207_CR11","doi-asserted-by":"crossref","unstructured":"Hu Z, Dong Y, Wang K, Sun Y (2020) Heterogeneous graph transformer. In: Proceedings of the 29th international conference on world wide web, pp 2704\u20132710","DOI":"10.1145\/3366423.3380027"},{"key":"207_CR12","doi-asserted-by":"crossref","unstructured":"Wang X, Ji H, Shi C, Wang B, Ye Y, Cui P, Yu PS (2019) Heterogeneous graph attention network. 
In: Proceedings of the 28th international conference on world wide web, pp 2022\u20132032","DOI":"10.1145\/3308558.3313562"},{"key":"207_CR13","doi-asserted-by":"crossref","unstructured":"Zhao J, Wang X, Shi C, Hu B, Song G, Ye Y (2021) Heterogeneous graph structure learning for graph neural networks. In: Proceedings of the 35th AAAI conference on artificial intelligence, pp 4697\u20134705","DOI":"10.1609\/aaai.v35i5.16600"},{"key":"207_CR14","first-page":"70","volume":"21","author":"SM Kazemi","year":"2020","unstructured":"Kazemi SM, Goel R, Jain K, Kobyzev I, Sethi A, Forsyth P, Poupart P (2020) Representation learning for dynamic graphs: a survey. J Mach Learn Res 21:70\u201317073","journal-title":"J Mach Learn Res"},{"key":"207_CR15","doi-asserted-by":"crossref","unstructured":"Fan Y, Ju M, Zhang C, Ye Y (2022) Heterogeneous temporal graph neural network. In: Proceedings of the 2022 SIAM international conference on data mining, pp 657\u2013665","DOI":"10.1137\/1.9781611977172.74"},{"key":"207_CR16","doi-asserted-by":"crossref","unstructured":"Pareja A, Domeniconi G, Chen J, Ma T, Suzumura T, Kanezashi H, Kaler T, Schardl TB, Leiserson CE (2020) Evolvegcn: evolving graph convolutional networks for dynamic graphs. In: Proceedings of the 34th AAAI conference on artificial intelligence, pp 5363\u20135370","DOI":"10.1609\/aaai.v34i04.5984"},{"key":"207_CR17","doi-asserted-by":"crossref","unstructured":"Sankar A, Wu Y, Gou L, Zhang W, Yang H (2020) Dysat: deep neural representation learning on dynamic graphs via self-attention networks. In: Proceedings of the 13th international conference on web search and data mining, pp 519\u2013527","DOI":"10.1145\/3336191.3371845"},{"key":"207_CR18","doi-asserted-by":"crossref","unstructured":"Xue H, Yang L, Jiang W, Wei Y, Hu Y, Lin Y (2020) Modeling dynamic heterogeneous network for link prediction using hierarchical attention with temporal RNN. 
In: Proceedings of the 2020 European conference on machine learning and knowledge discovery in databases, vol 12457, pp 282\u2013298","DOI":"10.1007\/978-3-030-67658-2_17"},{"key":"207_CR19","doi-asserted-by":"crossref","unstructured":"Huang H, Shi R, Zhou W, Wang X, Jin H, Fu X (2021) Temporal heterogeneous information network embedding. In: Proceedings of the 30th international joint conference on artificial intelligence, pp 1470\u20131476","DOI":"10.24963\/ijcai.2021\/203"},{"key":"207_CR20","doi-asserted-by":"crossref","unstructured":"Ji Y, Jia T, Fang Y, Shi C (2021) Dynamic heterogeneous graph embedding via heterogeneous hawkes process. In: Proceedings of the 2021 European conference on machine learning and knowledge discovery in databases, vol 12975, pp 388\u2013403","DOI":"10.1007\/978-3-030-86486-6_24"},{"key":"207_CR21","doi-asserted-by":"crossref","unstructured":"Kumar S, Zhang X, Leskovec J (2019) Predicting dynamic embedding trajectory in temporal interaction networks. In: Proceedings of the 25th ACM SIGKDD international conference on knowledge discovery and data mining, pp 1269\u20131278","DOI":"10.1145\/3292500.3330895"},{"key":"207_CR22","unstructured":"Wang Y, Chang Y, Liu Y, Leskovec J, Li P (2021) Inductive representation learning in temporal networks via causal anonymous walks. In: Proceedings of the 9th international conference on learning representations"},{"key":"207_CR23","unstructured":"Xu D, Ruan C, K\u00f6rpeoglu E, Kumar S, Achan K (2020) Inductive representation learning on temporal graphs. In: Proceedings of the 8th international conference on learning representations"},{"key":"207_CR24","unstructured":"Rossi E, Chamberlain B, Frasca F, Eynard D, Monti F, Bronstein MM (2020) Temporal graph networks for deep learning on dynamic graphs. CoRR arXiv:2006.10637"},{"key":"207_CR25","unstructured":"Hamilton WL, Ying Z, Leskovec J (2017) Inductive representation learning on large graphs. 
In: Proceedings of the 31st international conference on neural information processing systems, pp 1024\u20131034"},{"key":"207_CR26","doi-asserted-by":"crossref","unstructured":"Grover A, Leskovec J (2016) node2vec: Scalable feature learning for networks. In: Proceedings of the 22nd ACM SIGKDD international conference on knowledge discovery and data mining, pp 855\u2013864","DOI":"10.1145\/2939672.2939754"},{"key":"207_CR27","doi-asserted-by":"crossref","unstructured":"Perozzi B, Al-Rfou R, Skiena S (2014) Deepwalk: online learning of social representations. In: Proceedings of the 20th ACM SIGKDD international conference on knowledge discovery and data mining, pp 701\u2013710","DOI":"10.1145\/2623330.2623732"},{"key":"207_CR28","doi-asserted-by":"publisher","first-page":"156","DOI":"10.1007\/s41019-022-00184-6","volume":"7","author":"J Luo","year":"2022","unstructured":"Luo J, Xiao S, Jiang S, Gao H, Xiao Y (2022) ripple2vec: node embedding with ripple distance of structures. Data Sci Eng 7:156\u2013174","journal-title":"Data Sci Eng"},{"key":"207_CR29","doi-asserted-by":"crossref","unstructured":"You J, Du T, Leskovec J (2022) ROLAND: graph learning framework for dynamic graphs. In: Proceedings of the 28th ACM SIGKDD international conference on knowledge discovery and data mining, pp 2358\u20132366","DOI":"10.1145\/3534678.3539300"},{"key":"207_CR30","unstructured":"Trivedi R, Farajtabar M, Biswal P, Zha H (2019) Dyrep: learning representations over dynamic graphs. In: Proceedings of the 7th international conference on learning representations"},{"key":"207_CR31","unstructured":"Vaswani A, Shazeer N, Parmar N, Uszkoreit J, Jones L, Gomez AN, Kaiser L, Polosukhin I (2017) Attention is all you need. In: Proceedings of the 31st international conference on neural information processing systems, pp 5998\u20136008"},{"key":"207_CR32","unstructured":"Yun S, Jeong M, Kim R, Kang J, Kim HJ (2019) Graph transformer networks. 
In: Proceedings of the 33rd international conference on neural information processing systems, pp 11960\u201311970"},{"key":"207_CR33","unstructured":"Ying C, Cai T, Luo S, Zheng S, Ke G, He D, Shen Y, Liu T (2021) Do transformers really perform badly for graph representation? In: Proceedings of the 35th international conference on neural information processing systems, pp 28877\u201328888"},{"key":"207_CR34","doi-asserted-by":"crossref","unstructured":"Ji G, He S, Xu L, Liu K, Zhao J (2015) Knowledge graph embedding via dynamic mapping matrix. In: Proceedings of the 53rd annual meeting of the association for computational linguistics, pp 687\u2013696","DOI":"10.3115\/v1\/P15-1067"},{"key":"207_CR35","unstructured":"Ba LJ, Kiros JR, Hinton GE (2016) Layer normalization. CoRR arXiv:1607.06450"},{"key":"207_CR36","unstructured":"Kazemi SM, Goel R, Eghbali S, Ramanan J, Sahota J, Thakur S, Wu S, Smyth C, Poupart P, Brubaker M (2019) Time2vec: learning a vector representation of time. CoRR arXiv:1907.05321"},{"key":"207_CR37","unstructured":"Xu D, Ruan C, K\u00f6rpeoglu E, Kumar S, Achan K (2019) Self-attention with functional time representation learning. In: Proceedings of the 33rd international conference on neural information processing systems, pp 15889\u201315899"},{"issue":"8","key":"207_CR38","doi-asserted-by":"publisher","first-page":"1735","DOI":"10.1162\/neco.1997.9.8.1735","volume":"9","author":"S Hochreiter","year":"1997","unstructured":"Hochreiter S, Schmidhuber J (1997) Long short-term memory. Neural Comput 9(8):1735\u20131780","journal-title":"Neural Comput"},{"key":"207_CR39","doi-asserted-by":"crossref","unstructured":"Cho K, van Merrienboer B, G\u00fcl\u00e7ehre \u00c7, Bahdanau D, Bougares F, Schwenk H, Bengio Y (2014) Learning phrase representations using RNN encoder-decoder for statistical machine translation. 
In: Proceedings of the 2014 conference on empirical methods in natural language processing, pp 1724\u20131734","DOI":"10.3115\/v1\/D14-1179"},{"key":"207_CR40","unstructured":"Fey M, Lenssen JE (2019) Fast graph representation learning with pytorch geometric. CoRR arXiv:1903.02428"}],"container-title":["Data Science and Engineering"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s41019-023-00207-w.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s41019-023-00207-w\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s41019-023-00207-w.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2023,5,15]],"date-time":"2023-05-15T14:45:04Z","timestamp":1684161904000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s41019-023-00207-w"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2023,3,22]]},"references-count":40,"journal-issue":{"issue":"2","published-print":{"date-parts":[[2023,6]]}},"alternative-id":["207"],"URL":"https:\/\/doi.org\/10.1007\/s41019-023-00207-w","relation":{},"ISSN":["2364-1185","2364-1541"],"issn-type":[{"value":"2364-1185","type":"print"},{"value":"2364-1541","type":"electronic"}],"subject":[],"published":{"date-parts":[[2023,3,22]]},"assertion":[{"value":"4 December 2022","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"7 February 2023","order":2,"name":"revised","label":"Revised","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"1 March 2023","order":3,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"22 
March 2023","order":4,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}},{"order":1,"name":"Ethics","group":{"name":"EthicsHeading","label":"Declarations"}},{"value":"We would like to submit the enclosed manuscript entitled \u201cMemory-Enhanced Transformer for Representation Learning on Temporal Heterogeneous Graphs\u201d, which we wish to be considered for publication in Data Science and Engineering. No conflict of interest exits in the submission of this manuscript, and manuscript is approved by all authors for publication. I would like to declare on behalf of my co-authors that the work described was original research that has not been published previously, and not under consideration for publication elsewhere, in whole or in part. All the authors listed have approved the manuscript that is enclosed.","order":2,"name":"Ethics","group":{"name":"EthicsHeading","label":"Conflicts of interest"}}]}}