We introduce a tunable GAN, called $\alpha$-GAN, parameterized by $\alpha$, which interpolates between various $f$-GANs and Integral Probability Metric based GANs (under a constrained discriminator set). We construct $\alpha$-GAN using a supervised loss function, namely, $\alpha$-loss, which is a tunable loss function capturing several canonical losses. We show that $\alpha$-GAN is intimately related to the Arimoto divergence, which was first proposed by Österreicher (1996), and later studied by Liese and Vajda (2006). We also study the convergence properties of $\alpha$-GAN. We posit that the holistic understanding that $\alpha$-GAN introduces will have practical benefits in addressing both the issues of vanishing gradients and mode collapse.
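Since the abstract does not reproduce the loss itself, the following is a minimal numerical sketch, assuming the standard definition of $\alpha$-loss on the probability $p$ assigned to the true label, $\ell_\alpha(p) = \frac{\alpha}{\alpha-1}\bigl(1 - p^{(\alpha-1)/\alpha}\bigr)$, from the $\alpha$-loss literature; the function name `alpha_loss` and the chosen probability grid are illustrative only. It shows how a single parameter sweeps between canonical losses: the $\alpha \to 1$ limit recovers log-loss, while large $\alpha$ approaches the soft 0-1 loss $1 - p$.

```python
import numpy as np

def alpha_loss(p_true, alpha):
    """Alpha-loss evaluated on the probability assigned to the true label.

    For alpha != 1: (alpha / (alpha - 1)) * (1 - p^{(alpha-1)/alpha}).
    The alpha -> 1 limit is log-loss; alpha -> infinity gives 1 - p.
    """
    p = np.asarray(p_true, dtype=float)
    if np.isclose(alpha, 1.0):
        return -np.log(p)  # cross-entropy (log-loss) limit
    return (alpha / (alpha - 1.0)) * (1.0 - p ** ((alpha - 1.0) / alpha))

# Confidence placed on the correct class, and a few settings of alpha.
p = np.linspace(0.05, 0.95, 5)
for a in (0.5, 1.0, 2.0, 1e6):
    print(f"alpha={a:g}:", np.round(alpha_loss(p, a), 3))
# alpha=1 matches -log(p); very large alpha approaches the soft 0-1 loss 1 - p.
```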