(q,p)-Wasserstein GANs: Comparing Ground Metrics for Wasserstein GANs

10 February 2019
Anton Mallasto
J. Frellsen
Wouter Boomsma
Aasa Feragen
Abstract

Generative Adversarial Networks (GANs) have made a major impact in computer vision and machine learning as generative models. Wasserstein GANs (WGANs) brought Optimal Transport (OT) theory into GANs by minimizing the 1-Wasserstein distance between the model and data distributions as their objective function. Since then, WGANs have gained considerable interest due to their stability and theoretical framework. We contribute to the WGAN literature by introducing the family of (q,p)-Wasserstein GANs, which allow the use of more general p-Wasserstein metrics for p ≥ 1 in the GAN learning procedure. While the method is able to incorporate any cost function as the ground metric, we focus on studying the l^q metrics for q ≥ 1. This is a notable generalization, as in the WGAN literature the OT distances are commonly based on the l^2 ground metric. We demonstrate the effect of different p-Wasserstein distances in two toy examples. Furthermore, we show that the ground metric does make a difference, by comparing different (q,p) pairs on the MNIST and CIFAR-10 datasets. Our experiments demonstrate that changing the ground metric and p can notably improve on the common (q,p) = (2,1) case.
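To make the role of the ground metric concrete, the following is a minimal sketch of how an l^q ground metric could enter a WGAN-GP-style critic loss for the p = 1 case: 1-Lipschitzness with respect to l^q corresponds to bounding the critic's gradient in the dual norm l^{q*}, where 1/q + 1/q* = 1. This is an illustrative construction under that assumption, not the authors' exact (q,p)-WGAN training procedure; the function name critic_loss_lq and the penalty weight lam are hypothetical choices, not from the paper.

import torch

def critic_loss_lq(critic, real, fake, q=2.0, lam=10.0):
    """Sketch of a 1-Wasserstein critic loss with an l^q ground metric.

    Lipschitzness w.r.t. the l^q metric is encouraged by penalizing the
    critic's gradient norm in the dual norm l^{q*} (WGAN-GP style).
    """
    # Dual exponent q* of q (q = 1 gives the l^infinity dual norm).
    q_star = q / (q - 1.0) if q > 1.0 else float("inf")

    # Standard WGAN critic objective: E[f(fake)] - E[f(real)] (to be minimized).
    wasserstein_term = critic(fake).mean() - critic(real).mean()

    # Gradient penalty on random interpolates between real and fake samples.
    eps = torch.rand(real.size(0), *([1] * (real.dim() - 1)), device=real.device)
    interp = (eps * real + (1 - eps) * fake).requires_grad_(True)
    grad = torch.autograd.grad(critic(interp).sum(), interp, create_graph=True)[0]
    grad_norm = grad.flatten(1).norm(p=q_star, dim=1)
    penalty = ((grad_norm - 1.0).clamp(min=0.0) ** 2).mean()

    return wasserstein_term + lam * penalty

The only place the choice of q appears is the dual norm used in the penalty; setting q = 2 recovers the usual l^2-based WGAN-GP constraint, which is the (q,p) = (2,1) baseline the abstract refers to.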
