A Smoothed Dual Approach for Variational Wasserstein Problems
Variational problems that involve Wasserstein distances have recently been proposed as a means to summarize and learn from probability measures. Despite being conceptually simple, such problems are computationally challenging because they involve minimizing over quantities (Wasserstein distances) that are themselves hard to compute. We show that the dual formulation of Wasserstein variational problems introduced recently by Carlier et al. (2014) can be regularized using an entropic smoothing, which leads to smooth, differentiable, convex optimization problems that are simpler to implement and numerically more stable. In addition to these favorable properties, we propose a simple and effective heuristic to initialize the variables of that formulation. We illustrate the versatility of our smoothed dual formulation by applying it to the computation of Wasserstein barycenters and by carrying out dictionary learning on a dataset of histograms, using the Wasserstein distance as the fitting error.
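To give a concrete feel for the entropic smoothing the abstract refers to, the sketch below computes an entropy-regularized Wasserstein distance between two histograms via alternating dual scaling updates (the Sinkhorn iteration). This is a generic illustration of entropic regularization of optimal transport, not the paper's specific smoothed dual algorithm; the function name, the regularization strength `gamma`, and the toy histograms are all illustrative choices.

```python
import numpy as np

def entropic_wasserstein(a, b, C, gamma=0.02, n_iter=500):
    """Entropy-regularized optimal transport cost between histograms
    a and b with ground cost matrix C; gamma controls the smoothing."""
    K = np.exp(-C / gamma)           # Gibbs kernel derived from the cost
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)            # alternating updates of the dual
        u = a / (K @ v)              # (scaling) variables
    P = u[:, None] * K * v[None, :]  # regularized transport plan
    return np.sum(P * C), P

# Toy example: two Gaussian-like histograms on a 1-D grid,
# with the squared distance as ground cost.
x = np.linspace(0, 1, 50)
a = np.exp(-(x - 0.3) ** 2 / 0.01); a /= a.sum()
b = np.exp(-(x - 0.7) ** 2 / 0.01); b /= b.sum()
C = (x[:, None] - x[None, :]) ** 2
d, P = entropic_wasserstein(a, b, C)
```

Because the objective is smoothed, the iteration only involves matrix-vector products and elementwise divisions, which is part of what makes the regularized problem simple to implement and differentiate.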