We develop a theory of finite-dimensional polyhedral subsets over the Wasserstein space and optimization of functionals over them via first-order methods. Our main application is to the problem of mean-field variational inference (MFVI), which seeks to approximate a distribution $\pi$ over $\mathbb{R}^d$ by a product measure $\pi^\star$. When $\pi$ is strongly log-concave and log-smooth, we provide (1) approximation rates certifying that $\pi^\star$ is close to the minimizer $\pi^\star_\diamond$ of the KL divergence over a \emph{polyhedral} set $\mathcal{P}_\diamond$, and (2) an algorithm for minimizing $\mathrm{KL}(\cdot\,\|\,\pi)$ over $\mathcal{P}_\diamond$ based on accelerated gradient descent over $\mathbb{R}^d$. As a byproduct of our analysis, we obtain the first end-to-end analysis for gradient-based algorithms for MFVI.
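To make the MFVI objective concrete, here is a hedged sketch (not the paper's polyhedral algorithm) of plain gradient descent for mean-field variational inference in the one case where everything is closed form: a Gaussian target $\pi = N(\mu, \Sigma)$, which is strongly log-concave and log-smooth, approximated by a product of univariate Gaussians. All names (`mu`, `Sigma`, `step`, the iteration count) are illustrative choices, not from the paper.

```python
# Mean-field VI by gradient descent on the closed-form KL divergence
# between a product-Gaussian candidate q = prod_i N(m_i, s_i^2) and a
# Gaussian target pi = N(mu, Sigma).
import numpy as np

mu = np.array([1.0, -2.0])                  # target mean
Sigma = np.array([[2.0, 0.8], [0.8, 1.0]])  # target covariance (SPD)
P = np.linalg.inv(Sigma)                    # target precision matrix

def kl_mean_field(m, s):
    """KL( prod_i N(m_i, s_i^2) || N(mu, Sigma) ), closed form."""
    D = np.diag(s ** 2)
    diff = mu - m
    return 0.5 * (np.trace(P @ D) + diff @ P @ diff - len(mu)
                  + np.linalg.slogdet(Sigma)[1] - np.sum(np.log(s ** 2)))

# Gradient descent on (m, s); the gradients below are differentiated
# directly from the closed-form KL above.
m, s = np.zeros(2), np.ones(2)
step = 0.1
for _ in range(2000):
    grad_m = P @ (m - mu)              # d KL / d m
    grad_s = np.diag(P) * s - 1.0 / s  # d KL / d s_i
    m -= step * grad_m
    s -= step * grad_s

# The mean-field optimum recovers the target mean, with variances
# 1 / diag(Sigma^{-1}) -- the classical Gaussian MFVI fixed point.
print(m, s ** 2)
```

Note that the iterates live in a finite-dimensional parameter space $(m, s) \in \mathbb{R}^d \times \mathbb{R}^d_{>0}$, which is the flavor of reduction (optimization over Wasserstein space recast as first-order optimization over $\mathbb{R}^d$) that the paper develops in general, with acceleration and without the Gaussian restriction.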
@article{jiang2025_2312.02849,
  title   = {Algorithms for mean-field variational inference via polyhedral optimization in the Wasserstein space},
  author  = {Yiheng Jiang and Sinho Chewi and Aram-Alexandre Pooladian},
  journal = {arXiv preprint arXiv:2312.02849},
  year    = {2025}
}