
Low-dimensional adaptation of diffusion models: Convergence in total variation

22 January 2025
Jiadong Liang
Zhihan Huang
Yuxin Chen
Main: 53 pages · Bibliography: 3 pages · Appendix: 1 page · 2 tables
Abstract

This paper investigates how diffusion generative models leverage (unknown) low-dimensional structure to accelerate sampling. Focusing on two mainstream samplers -- the denoising diffusion implicit model (DDIM) and the denoising diffusion probabilistic model (DDPM) -- and assuming accurate score estimates, we prove that their iteration complexities are no greater than the order of $k/\varepsilon$ (up to some log factor), where $\varepsilon$ is the precision in total variation distance and $k$ is some intrinsic dimension of the target distribution. Our results are applicable to a broad family of target distributions without requiring smoothness or log-concavity assumptions. Further, we develop a lower bound that suggests the (near) necessity of the coefficients introduced by Ho et al. (2020) and Song et al. (2020) in facilitating low-dimensional adaptation. Our findings provide the first rigorous evidence for the adaptivity of the DDIM-type samplers to unknown low-dimensional structure, and improve over the state-of-the-art DDPM theory regarding total variation convergence.
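For context, the coefficients whose near-necessity the paper studies are the sqrt(alpha_bar) weightings in the standard DDPM/DDIM updates. Below is a minimal NumPy sketch of one deterministic DDIM step in the style of Song et al. (2020), one of the two samplers the abstract analyzes; the names ddim_step, alpha_bar, and eps_theta are illustrative placeholders rather than notation from the paper, and the noise predictor is assumed accurate, matching the paper's accurate-score assumption.

import numpy as np

def ddim_step(x_t, t, t_prev, alpha_bar, eps_theta):
    """One deterministic DDIM update (eta = 0), mapping x_t to x_{t_prev}.

    alpha_bar: array with alpha_bar[t] = prod_{s <= t} (1 - beta_s),
               the cumulative products of the noise schedule.
    eps_theta: callable (x, t) -> predicted noise; assumed accurate.
    """
    a_t, a_prev = alpha_bar[t], alpha_bar[t_prev]
    eps = eps_theta(x_t, t)
    # Clean sample x_0 implied by the current iterate under the forward model.
    x0_hat = (x_t - np.sqrt(1.0 - a_t) * eps) / np.sqrt(a_t)
    # Re-noise x0_hat to the less noisy level t_prev; these sqrt(alpha_bar)
    # coefficients are the ones the paper's lower bound concerns.
    return np.sqrt(a_prev) * x0_hat + np.sqrt(1.0 - a_prev) * eps

In this form the update never moves off the set of points consistent with the learned noise predictions, which is one intuition for why DDIM-type iterates can adapt to a low-dimensional target without knowing its intrinsic dimension $k$.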

@article{liang2025_2501.12982,
  title={Low-dimensional adaptation of diffusion models: Convergence in total variation},
  author={Jiadong Liang and Zhihan Huang and Yuxin Chen},
  journal={arXiv preprint arXiv:2501.12982},
  year={2025}
}