ResearchTrend.AI
Almost Linear Convergence under Minimal Score Assumptions: Quantized Transition Diffusion

28 May 2025
Xunpeng Huang
Yingyu Lin
Nikki Lijing Kuang
Hanze Dong
Difan Zou
Yian Ma
Tong Zhang
    DiffM
arXiv (abs) · PDF · HTML
Main: 11 pages · 4 figures · 3 tables · Bibliography: 3 pages · Appendix: 23 pages
Abstract

Continuous diffusion models have demonstrated remarkable performance in data generation across various domains, yet their efficiency remains constrained by two critical limitations: (1) the local adjacency structure of the forward Markov process, which restricts long-range transitions in the data space, and (2) inherent biases introduced during the simulation of time-inhomogeneous reverse denoising processes. To address these challenges, we propose Quantized Transition Diffusion (QTD), a novel approach that integrates data quantization with discrete diffusion dynamics. Our method first transforms the continuous data distribution p_* into a discrete one q_* via histogram approximation and binary encoding, enabling efficient representation in a structured discrete latent space. We then design a continuous-time Markov chain (CTMC) with Hamming distance-based transitions as the forward process, which inherently supports long-range movements in the original data space. For reverse-time sampling, we introduce a truncated uniformization technique to simulate the reverse CTMC, which can provably provide unbiased generation from q_* under minimal score assumptions. Through a novel KL dynamic analysis of the reverse CTMC, we prove that QTD can generate samples with O(d ln²(d/ε)) score evaluations in expectation to approximate the d-dimensional target distribution p_* within an ε error tolerance. Our method not only establishes state-of-the-art inference efficiency but also advances the theoretical foundations of diffusion-based generative modeling by unifying discrete and continuous diffusion paradigms.
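The abstract's first step — mapping a continuous distribution p_* to a discrete q_* via histogram approximation and binary encoding — can be sketched as follows. This is a generic illustration, not the paper's implementation: the function names, the fixed range [low, high], and the per-coordinate bit width `n_bits` are assumptions for the example.

```python
import numpy as np

def quantize_binary_encode(x, low, high, n_bits=8):
    """Histogram-style quantization: map continuous values in [low, high]
    to one of 2**n_bits bins, then to binary codewords (illustrative sketch)."""
    n_bins = 2 ** n_bits
    # clip and scale each value to a bin index in 0 .. n_bins - 1
    idx = np.clip(((x - low) / (high - low) * n_bins).astype(int), 0, n_bins - 1)
    # binary encoding: each bin index becomes an n_bits-length 0/1 vector
    codes = (idx[..., None] >> np.arange(n_bits - 1, -1, -1)) & 1
    return idx, codes

def binary_decode(codes, low, high):
    """Invert the encoding, mapping each codeword back to its bin centre."""
    n_bits = codes.shape[-1]
    idx = (codes * (2 ** np.arange(n_bits - 1, -1, -1))).sum(-1)
    return low + (idx + 0.5) * (high - low) / n_bits ** 0  / 2 ** n_bits * 2 ** n_bits / 2 ** n_bits
```

The binary codes then live in a Hamming cube, where the CTMC forward process described next operates; the round-trip error is at most half a bin width, which shrinks exponentially in `n_bits`.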

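For intuition on the sampling side, here is classical uniformization for simulating a CTMC with generator matrix Q: candidate jump times follow a Poisson process of rate λ ≥ max_i |Q[i,i]|, and at each epoch the state moves according to P = I + Q/λ. This is the standard technique only — the paper's truncated uniformization variant and its score-based reverse CTMC are not reproduced here.

```python
import numpy as np

def uniformization_sample(Q, x0, T, rng):
    """Simulate a finite-state CTMC with generator Q over [0, T] by
    uniformization (standard version, illustrative sketch)."""
    lam = np.max(-np.diag(Q))           # uniformization rate >= all exit rates
    P = np.eye(Q.shape[0]) + Q / lam    # stochastic matrix: rows sum to 1
    n_jumps = rng.poisson(lam * T)      # number of candidate jumps on [0, T]
    x = x0
    for _ in range(n_jumps):
        x = rng.choice(Q.shape[0], p=P[x])  # may self-loop (a "virtual" jump)
    return x
```

Because the Poisson clock runs at a fixed rate, some candidate jumps are self-loops; this is what makes the simulation exact rather than a time-discretized approximation, which is the property the abstract's unbiasedness claim builds on.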
@article{huang2025_2505.21892,
  title={Almost Linear Convergence under Minimal Score Assumptions: Quantized Transition Diffusion},
  author={Xunpeng Huang and Yingyu Lin and Nikki Lijing Kuang and Hanze Dong and Difan Zou and Yian Ma and Tong Zhang},
  journal={arXiv preprint arXiv:2505.21892},
  year={2025}
}