
Quantization-based Bounds on the Wasserstein Metric

Main: 9 pages · 8 figures · 7 tables · Bibliography: 3 pages · Appendix: 11 pages
Abstract

The Wasserstein metric has become increasingly important in many machine learning applications such as generative modeling, image retrieval and domain adaptation. Despite its appeal, it is often too costly to compute. This has motivated approximation methods like entropy-regularized optimal transport, downsampling, and subsampling, which trade accuracy for computational efficiency. In this paper, we consider the challenge of computing efficient approximations to the Wasserstein metric that also serve as strict upper or lower bounds. Focusing on discrete measures on regular grids, our approach involves formulating and exactly solving a Kantorovich problem on a coarse grid using a quantized measure and specially designed cost matrix, followed by an upscaling and correction stage. This is done either in the primal or dual space to obtain valid upper and lower bounds on the Wasserstein metric of the full-resolution inputs. We evaluate our methods on the DOTmark optimal transport images benchmark, demonstrating a 10x-100x speedup compared to entropy-regularized OT while keeping the approximation error below 2%.
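The bounding idea described in the abstract can be sketched in a simplified form: quantize both measures to a coarse grid, then solve the coarse Kantorovich problem exactly with a cost matrix built from the minimum (for a lower bound) or maximum (for an upper bound) fine-grid distance between coarse cells. The following is an illustrative 1D sketch, not the authors' algorithm; the grid sizes, the generic LP solver, and the omission of the upscaling/correction stage are all assumptions made for brevity.

```python
import numpy as np
from scipy.optimize import linprog

def exact_ot(a, b, C):
    """Solve the Kantorovich LP  min <C, P>  s.t.  P @ 1 = a,  P.T @ 1 = b  exactly."""
    n, m = C.shape
    A_eq = []
    for i in range(n):                      # row-sum constraints: P @ 1 = a
        row = np.zeros((n, m)); row[i, :] = 1
        A_eq.append(row.ravel())
    for j in range(m):                      # column-sum constraints: P.T @ 1 = b
        col = np.zeros((n, m)); col[:, j] = 1
        A_eq.append(col.ravel())
    res = linprog(C.ravel(), A_eq=np.array(A_eq),
                  b_eq=np.concatenate([a, b]), bounds=(0, None), method="highs")
    return res.fun

# Fine grid: n points on [0, 1); coarse grid: k cells of n // k points each.
n, k = 16, 4
rng = np.random.default_rng(0)
a = rng.random(n); a /= a.sum()
b = rng.random(n); b /= b.sum()
x = np.arange(n) / n

# Exact W1 on the fine grid (the quantity being bounded).
w_exact = exact_ot(a, b, np.abs(x[:, None] - x[None, :]))

# Quantization: sum the mass inside each coarse cell.
a_q = a.reshape(k, -1).sum(axis=1)
b_q = b.reshape(k, -1).sum(axis=1)
cells = x.reshape(k, -1)

# Coarse cost between cells i, j: the minimum (resp. maximum) distance between
# their fine-grid points yields a valid lower (resp. upper) bound cost matrix.
C_lo = np.empty((k, k)); C_hi = np.empty((k, k))
for i in range(k):
    for j in range(k):
        d = np.abs(cells[i][:, None] - cells[j][None, :])
        C_lo[i, j], C_hi[i, j] = d.min(), d.max()

lower = exact_ot(a_q, b_q, C_lo)
upper = exact_ot(a_q, b_q, C_hi)
# The coarse problems have k**2 variables instead of n**2, hence the speedup.
```

The lower bound holds because any fine-grid coupling induces a coarse coupling whose cost under `C_lo` can only be smaller; the upper bound holds because any optimal coarse coupling under `C_hi` can be refined into a feasible fine-grid coupling of no greater cost.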

@article{bobrutsky2025_2506.00976,
  title={Quantization-based Bounds on the Wasserstein Metric},
  author={Jonathan Bobrutsky and Amit Moscovich},
  journal={arXiv preprint arXiv:2506.00976},
  year={2025}
}