Conditional Diffusion Models with Classifier-Free Gibbs-like Guidance

27 May 2025
Badr Moufad, Yazid Janati, Alain Durmus, Ahmed Ghorbel, Eric Moulines, Jimmy Olsson
Main: 9 pages · Appendix: 20 pages · Bibliography: 3 pages · 7 figures · 8 tables
Abstract

Classifier-Free Guidance (CFG) is a widely used technique for improving conditional diffusion models by linearly combining the outputs of conditional and unconditional denoisers. While CFG enhances visual quality and improves alignment with prompts, it often reduces sample diversity, leading to a challenging trade-off between quality and diversity. To address this issue, we make two key contributions. First, we show that CFG generally does not correspond to a well-defined denoising diffusion model (DDM). In particular, contrary to common intuition, CFG does not yield samples from the target distribution associated with the limiting CFG score as the noise level approaches zero -- where the data distribution is tilted by a power w > 1 of the conditional distribution. We identify the missing component: a Rényi divergence term that acts as a repulsive force and is required to correct CFG and render it consistent with a proper DDM. Our analysis shows that this correction term vanishes in the low-noise limit. Second, motivated by this insight, we propose a Gibbs-like sampling procedure to draw samples from the desired tilted distribution. This method starts with an initial sample from the conditional diffusion model without CFG and iteratively refines it, preserving diversity while progressively enhancing sample quality. We evaluate our approach on both image and text-to-audio generation tasks, demonstrating substantial improvements over CFG across all considered metrics. The code is available at this https URL
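
As a minimal sketch of the linear combination the abstract refers to, the snippet below implements standard classifier-free guidance in the epsilon-prediction convention, extrapolating from the unconditional toward the conditional noise prediction with weight w. The function name, parameterization, and toy inputs are illustrative assumptions; they are not the paper's proposed Gibbs-like procedure, only the baseline CFG update it analyzes.

import numpy as np

def cfg_combine(eps_cond, eps_uncond, w):
    # Classifier-free guidance: extrapolate from the unconditional noise
    # prediction toward the conditional one with guidance weight w.
    # w = 1 recovers the plain conditional denoiser; w > 1 is the regime
    # whose limiting target the abstract describes as a tilted distribution.
    return eps_uncond + w * (eps_cond - eps_uncond)

# Toy stand-ins for the two denoiser outputs (hypothetical values).
eps_cond = np.array([0.2, -0.1, 0.5])
eps_uncond = np.array([0.1, 0.0, 0.3])
print(cfg_combine(eps_cond, eps_uncond, w=3.0))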

@article{moufad2025_2505.21101,
  title={Conditional Diffusion Models with Classifier-Free Gibbs-like Guidance},
  author={Badr Moufad and Yazid Janati and Alain Durmus and Ahmed Ghorbel and Eric Moulines and Jimmy Olsson},
  journal={arXiv preprint arXiv:2505.21101},
  year={2025}
}