
Improving Compositional Generation with Diffusion Models Using Lift Scores

Abstract

We introduce a novel resampling criterion using lift scores for improving compositional generation in diffusion models. By leveraging lift scores, we evaluate whether generated samples align with each individual condition and then compose the results to determine whether the composed prompt is satisfied. Our key insight is that lift scores can be efficiently approximated using only the original diffusion model, requiring no additional training or external modules. We develop an optimized variant that incurs lower computational overhead during inference while maintaining effectiveness. Through extensive experiments, we demonstrate that lift scores significantly improve condition alignment for compositional generation across 2D synthetic data, CLEVR position tasks, and text-to-image synthesis. Our code is available at this http URL.
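As a rough illustration of how a lift score, log p(x | c) - log p(x), might be approximated with only the base diffusion model, the sketch below re-noises a candidate sample at random timesteps and compares conditional versus unconditional denoising errors, then accepts the sample only if the per-condition criterion holds for every condition. The interface (an epsilon-prediction model eps_model(x_t, t, cond) that accepts cond=None, the alphas_cumprod schedule, the ELBO-gap estimator, and the acceptance threshold) is assumed for illustration and is not the paper's exact estimator.

import torch

@torch.no_grad()
def approx_log_lift(eps_model, x0, cond, alphas_cumprod, n_samples=8, device="cpu"):
    """Monte-Carlo proxy for log lift(x0, cond) = log p(x0 | cond) - log p(x0).

    Re-noises x0 at random timesteps and compares conditional vs.
    unconditional denoising errors (hypothetical interface, not the
    authors' exact estimator).
    """
    total = torch.zeros(x0.shape[0], device=device)
    T = alphas_cumprod.shape[0]
    for _ in range(n_samples):
        t = torch.randint(0, T, (x0.shape[0],), device=device)
        a_bar = alphas_cumprod[t].view(-1, *[1] * (x0.dim() - 1))
        noise = torch.randn_like(x0)
        x_t = a_bar.sqrt() * x0 + (1 - a_bar).sqrt() * noise
        err_c = (eps_model(x_t, t, cond) - noise).pow(2).flatten(1).sum(1)
        err_u = (eps_model(x_t, t, None) - noise).pow(2).flatten(1).sum(1)
        # A smaller conditional error than unconditional error suggests the
        # sample is better explained by the condition, i.e. a positive lift.
        total += err_u - err_c
    return total / n_samples

def accept_composed(log_lifts, threshold=0.0):
    """Accept a sample only if the lift criterion holds for every condition.

    `log_lifts` is a list of per-condition score tensors of shape [batch].
    """
    keep = torch.ones_like(log_lifts[0], dtype=torch.bool)
    for scores in log_lifts:
        keep &= scores > threshold
    return keep

In a resampling loop, one would generate a batch of candidates from the composed prompt, score each candidate against every single condition with approx_log_lift, and keep (or regenerate) samples according to accept_composed.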

@article{yu2025_2505.13740,
  title={Improving Compositional Generation with Diffusion Models Using Lift Scores},
  author={Chenning Yu and Sicun Gao},
  journal={arXiv preprint arXiv:2505.13740},
  year={2025}
}