{
  "status": "ok",
  "message-type": "work",
  "message-version": "1.0.0",
  "message": {
    "indexed": {
      "date-parts": [[2025, 10, 12]],
      "date-time": "2025-10-12T04:57:34Z",
      "timestamp": 1760245054247
    },
    "publisher-location": "California",
    "reference-count": 0,
    "publisher": "International Joint Conferences on Artificial Intelligence Organization",
    "content-domain": { "domain": [], "crossmark-restriction": false },
    "short-container-title": [],
    "published-print": { "date-parts": [[2019, 8]] },
    "abstract": "<jats:p>In this paper, we present a general framework to scale graph autoencoders (AE) and graph variational autoencoders (VAE). This framework leverages graph degeneracy concepts to train models only from a dense subset of nodes instead of using the entire graph. Together with a simple yet effective propagation mechanism, our approach significantly improves scalability and training speed while preserving performance. We evaluate and discuss our method on several variants of existing graph AE and VAE, providing the first application of these models to large graphs with up to millions of nodes and edges. We achieve empirically competitive results w.r.t. several popular scalable node embedding methods, which emphasizes the relevance of pursuing further research towards more scalable graph AE and VAE.</jats:p>",
    "DOI": "10.24963/ijcai.2019/465",
    "type": "proceedings-article",
    "created": {
      "date-parts": [[2019, 7, 28]],
      "date-time": "2019-07-28T03:46:05Z",
      "timestamp": 1564285565000
    },
    "page": "3353-3359",
    "source": "Crossref",
    "is-referenced-by-count": 15,
    "title": ["A Degeneracy Framework for Scalable Graph Autoencoders"],
    "prefix": "10.24963",
    "author": [
      {
        "given": "Guillaume",
        "family": "Salha",
        "sequence": "first",
        "affiliation": [
          { "name": "Deezer Research & Development, Paris, France" },
          { "name": "École Polytechnique, Palaiseau, France" }
        ]
      },
      {
        "given": "Romain",
        "family": "Hennequin",
        "sequence": "additional",
        "affiliation": [{ "name": "Deezer Research & Development, Paris, France" }]
      },
      {
        "given": "Viet Anh",
        "family": "Tran",
        "sequence": "additional",
        "affiliation": [{ "name": "Deezer Research & Development, Paris, France" }]
      },
      {
        "given": "Michalis",
        "family": "Vazirgiannis",
        "sequence": "additional",
        "affiliation": [{ "name": "Ecole Polytechnique, Palaiseau, France" }]
      }
    ],
    "member": "10584",
    "event": {
      "number": "28",
      "sponsor": ["International Joint Conferences on Artificial Intelligence Organization (IJCAI)"],
      "acronym": "IJCAI-2019",
      "name": "Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}",
      "start": { "date-parts": [[2019, 8, 10]] },
      "theme": "Artificial Intelligence",
      "location": "Macao, China",
      "end": { "date-parts": [[2019, 8, 16]] }
    },
    "container-title": ["Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence"],
    "original-title": [],
    "deposited": {
      "date-parts": [[2019, 7, 28]],
      "date-time": "2019-07-28T03:49:30Z",
      "timestamp": 1564285770000
    },
    "score": 1,
    "resource": { "primary": { "URL": "https://www.ijcai.org/proceedings/2019/465" } },
    "subtitle": [],
    "proceedings-subject": "Artificial Intelligence Research Articles",
    "short-title": [],
    "issued": { "date-parts": [[2019, 8]] },
    "references-count": 0,
    "URL": "https://doi.org/10.24963/ijcai.2019/465",
    "relation": {},
    "subject": [],
    "published": { "date-parts": [[2019, 8]] }
  }
}