
Low-dimensional adaptation of diffusion models: Convergence in total variation

Main: 53 pages
Bibliography: 3 pages
2 tables
Appendix: 1 page
Abstract

This paper investigates how diffusion generative models leverage (unknown) low-dimensional structure to accelerate sampling. Focusing on two mainstream samplers -- the denoising diffusion implicit model (DDIM) and the denoising diffusion probabilistic model (DDPM) -- and assuming accurate score estimates, we prove that their iteration complexities are no greater than the order of $k/\varepsilon$ (up to some log factor), where $\varepsilon$ is the precision in total variation distance and $k$ is some intrinsic dimension of the target distribution. Our results are applicable to a broad family of target distributions without requiring smoothness or log-concavity assumptions. Further, we develop a lower bound that suggests the (near) necessity of the coefficients introduced by Ho et al. (2020) and Song et al. (2020) in facilitating low-dimensional adaptation. Our findings provide the first rigorous evidence for the adaptivity of DDIM-type samplers to unknown low-dimensional structure, and improve over the state-of-the-art DDPM theory on convergence in total variation.
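To make the two samplers concrete, below is a minimal NumPy sketch of the score-based DDIM and DDPM reverse updates studied in this line of work. It is not the authors' implementation: the linear `betas` schedule, the step functions `ddim_step`/`ddpm_step`, and the toy standard-Gaussian target (whose diffused score is exactly $-x$ under the variance-preserving forward process) are illustrative assumptions.

```python
import numpy as np

T = 1000
betas = np.linspace(1e-4, 0.02, T)   # assumed forward noise schedule
alphas = 1.0 - betas
abars = np.cumprod(alphas)           # \bar{alpha}_t = prod_{s <= t} alpha_s

def score(x, t):
    # Toy target: for X_0 ~ N(0, I), the variance-preserving forward process
    # keeps the marginal N(0, I), so the exact score is -x.
    return -x

def ddim_step(x, t):
    # Deterministic DDIM update (Song et al., 2020), eta = 0,
    # written in terms of the score via eps_hat = -sqrt(1 - abar_t) * score.
    abar_t = abars[t]
    abar_prev = abars[t - 1] if t > 0 else 1.0
    eps_hat = -np.sqrt(1.0 - abar_t) * score(x, t)
    x0_hat = (x - np.sqrt(1.0 - abar_t) * eps_hat) / np.sqrt(abar_t)
    return np.sqrt(abar_prev) * x0_hat + np.sqrt(1.0 - abar_prev) * eps_hat

def ddpm_step(x, t, rng):
    # Stochastic DDPM update (Ho et al., 2020) with sigma_t^2 = beta_t:
    # posterior mean (x + beta_t * score) / sqrt(alpha_t), plus fresh noise.
    beta_t, alpha_t = betas[t], alphas[t]
    mean = (x + beta_t * score(x, t)) / np.sqrt(alpha_t)
    noise = rng.standard_normal(x.shape) if t > 0 else 0.0
    return mean + np.sqrt(beta_t) * noise

rng = np.random.default_rng(0)
x = rng.standard_normal(8)           # start from pure noise x_T ~ N(0, I)
for t in range(T - 1, -1, -1):
    x = ddim_step(x, t)              # swap in ddpm_step(x, t, rng) for DDPM
print(x)
```

With the exact score, both loops run the samplers at full resolution; the paper's point is that, with the coefficients above, the number of iterations needed for $\varepsilon$-accuracy in total variation scales with the intrinsic dimension $k$ rather than the ambient dimension.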

@article{liang2025_2501.12982,
  title={Low-dimensional adaptation of diffusion models: Convergence in total variation},
  author={Jiadong Liang and Zhihan Huang and Yuxin Chen},
  journal={arXiv preprint arXiv:2501.12982},
  year={2025}
}