{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2026,3,31]],"date-time":"2026-03-31T07:21:28Z","timestamp":1774941688902,"version":"3.50.1"},"publisher-location":"California","reference-count":0,"publisher":"International Joint Conferences on Artificial Intelligence Organization","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":[],"published-print":{"date-parts":[[2019,8]]},"abstract":"<jats:p>We propose a new formulation for learning generative adversarial networks (GANs) using optimal transport cost (the general form of Wasserstein distance)\u00a0as the objective criterion to measure the dissimilarity between target distribution and learned distribution. Our formulation is based on the general form of the Kantorovich duality which is applicable to optimal transport with a wide range of cost functions that are not necessarily metric. To make optimising this duality form amenable to gradient-based methods, we employ a function that acts as an amortised optimiser for the innermost optimisation problem. Interestingly, the amortised optimiser can be viewed as a mover since it strategically shifts around data points. The resulting formulation is a sequential min-max-min game with 3 players: the generator, the critic, and the mover where the new player, the mover, attempts to fool the critic by shifting the data around. Despite involving three players, we demonstrate that our proposed formulation can be trained reasonably effectively via a simple alternative gradient learning strategy. Compared with the existing Lipschitz-constrained formulations of Wasserstein GAN on CIFAR-10, our model yields significantly better diversity scores than weight clipping and comparable performance to gradient penalty method.<\/jats:p>","DOI":"10.24963\/ijcai.2019\/305","type":"proceedings-article","created":{"date-parts":[[2019,7,28]],"date-time":"2019-07-28T03:46:05Z","timestamp":1564285565000},"page":"2202-2208","source":"Crossref","is-referenced-by-count":3,"title":["Three-Player Wasserstein GAN via Amortised Duality"],"prefix":"10.24963","author":[{"given":"Nhan","family":"Dam","sequence":"first","affiliation":[{"name":"Monash University"}]},{"given":"Quan","family":"Hoang","sequence":"additional","affiliation":[{"name":"Monash University"}]},{"given":"Trung","family":"Le","sequence":"additional","affiliation":[{"name":"Monash University"}]},{"given":"Tu Dinh","family":"Nguyen","sequence":"additional","affiliation":[{"name":"Monash University"}]},{"given":"Hung","family":"Bui","sequence":"additional","affiliation":[{"name":"Google DeepMind"}]},{"given":"Dinh","family":"Phung","sequence":"additional","affiliation":[{"name":"Monash University"}]}],"member":"10584","event":{"name":"Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}","theme":"Artificial Intelligence","location":"Macao, China","acronym":"IJCAI-2019","number":"28","sponsor":["International Joint Conferences on Artificial Intelligence Organization (IJCAI)"],"start":{"date-parts":[[2019,8,10]]},"end":{"date-parts":[[2019,8,16]]}},"container-title":["Proceedings of the Twenty-Eighth International Joint Conference on Artificial Intelligence"],"original-title":[],"deposited":{"date-parts":[[2019,7,28]],"date-time":"2019-07-28T03:48:25Z","timestamp":1564285705000},"score":1,"resource":{"primary":{"URL":"https:\/\/www.ijcai.org\/proceedings\/2019\/305"}},"subtitle":[],"proceedings-subject":"Artificial Intelligence Research Articles","short-title":[],"issued":{"date-parts":[[2019,8]]},"references-count":0,"URL":"https:\/\/doi.org\/10.24963\/ijcai.2019\/305","relation":{},"subject":[],"published":{"date-parts":[[2019,8]]}}}