(f,Γ)-Divergences: Interpolating between f-Divergences and Integral Probability Metrics

We develop a rigorous and general framework for constructing information-theoretic divergences that subsume both f-divergences and integral probability metrics (IPMs), such as the 1-Wasserstein distance. We prove under which assumptions these divergences, hereafter referred to as (f,Γ)-divergences, provide a notion of 'distance' between probability measures and show that they can be expressed as a two-stage mass-redistribution/mass-transport process. The (f,Γ)-divergences inherit features from IPMs, such as the ability to compare distributions which are not absolutely continuous, as well as from f-divergences, namely the strict concavity of their variational representations and the ability to control heavy-tailed distributions for particular choices of f. When combined, these features establish a divergence with improved properties for estimation, statistical learning, and uncertainty quantification applications. Using statistical learning as an example, we demonstrate their advantage in training generative adversarial networks (GANs) for heavy-tailed, not absolutely continuous sample distributions. We also show improved performance and stability over the gradient-penalized Wasserstein GAN in image generation.
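For orientation, here is a hedged sketch of the two characterizations the abstract alludes to; the notation (a test-function space Γ, a convex f with f(1) = 0 and Legendre transform f^*, the f-divergence D_f, and the Γ-IPM W^Γ) and the argument ordering reflect our reading of the paper and may differ from its exact statements:

\[
D_f^{\Gamma}(P \,\|\, Q)
\;=\; \sup_{g \in \Gamma} \Big\{ \mathbb{E}_P[g] \;-\; \inf_{\nu \in \mathbb{R}} \big( \nu + \mathbb{E}_Q[f^{*}(g - \nu)] \big) \Big\}
\;=\; \inf_{\eta} \Big\{ D_f(\eta \,\|\, Q) + W^{\Gamma}(P, \eta) \Big\}.
\]

The supremum is the variational representation mentioned above (for strictly convex f its objective is strictly concave in g, in contrast with the linear IPM objective), and the infimal convolution is the two-stage process: mass is first redistributed from Q to an intermediate measure η at f-divergence cost, then transported from η to P at IPM cost. As a concrete instance, for f(x) = x log x the inner infimum over ν evaluates in closed form and one obtains a Γ-restricted Donsker-Varadhan-type formula, sup_{g ∈ Γ} { E_P[g] − log E_Q[e^g] }.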