We present a new algorithm to optimize distributions defined implicitly by parameterized stochastic diffusions. Doing so allows us to modify the outcome distribution of sampling processes by optimizing over their parameters. We introduce a general framework for first-order optimization of these processes that jointly performs optimization and sampling steps in a single loop. This approach is inspired by recent advances in bilevel optimization and automatic implicit differentiation, and leverages the view of sampling as optimization over the space of probability distributions. We provide theoretical guarantees on the performance of our method, as well as experimental results demonstrating its effectiveness. We apply it to training energy-based models and fine-tuning denoising diffusions.
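To make the single-loop idea concrete, here is a minimal toy sketch (not the paper's algorithm): particles follow unadjusted Langevin dynamics toward a Gaussian whose mean is the parameter, while that parameter is simultaneously updated by a first-order step on a downstream loss estimated from the current, non-equilibrated particle cloud. All names (theta, target, the step sizes) and the quadratic loss are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (illustrative, not from the paper): the sampling process
# targets N(theta, I); we tune theta so the sampled distribution's
# mean matches a fixed target.
n, d = 512, 2
x = rng.standard_normal((n, d))   # particle cloud
theta = np.zeros(d)               # diffusion parameter (Gaussian mean)
target = np.array([1.0, -2.0])    # desired mean of the outcome distribution
eta_x, eta_theta = 0.1, 0.05      # sampling / optimization step sizes

for _ in range(2000):
    # Sampling step: one Langevin update toward N(theta, I),
    # whose log-density gradient is -(x - theta).
    noise = rng.standard_normal((n, d))
    x += -eta_x * (x - theta) + np.sqrt(2 * eta_x) * noise
    # Optimization step: gradient of 0.5 * ||E[x] - target||^2 w.r.t. theta,
    # using the current particles as a plug-in estimate of E[x]
    # (at stationarity, E[x] = theta, so d E[x] / d theta is the identity).
    grad_theta = x.mean(axis=0) - target
    theta -= eta_theta * grad_theta
```

The point of the sketch is that sampling and optimization steps are interleaved in one loop, rather than running the diffusion to convergence inside each outer optimization step; after the loop, theta sits near the target up to sampling noise.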
@article{marion2025_2402.05468,
  title   = {Implicit Diffusion: Efficient Optimization through Stochastic Sampling},
  author  = {Pierre Marion and Anna Korba and Peter Bartlett and Mathieu Blondel and Valentin De Bortoli and Arnaud Doucet and Felipe Llinares-López and Courtney Paquette and Quentin Berthet},
  journal = {arXiv preprint arXiv:2402.05468},
  year    = {2025}
}