
Generative Image Compression by Estimating Gradients of the Rate-variable Feature Distribution

Main: 9 pages, 9 figures, 1 table; Bibliography: 3 pages; Appendix: 5 pages
Abstract

While learned image compression (LIC) focuses on efficient data transmission, generative image compression (GIC) extends this framework by integrating generative modeling to produce photo-realistic reconstructions. In this paper, we propose a novel diffusion-based generative modeling framework tailored for generative image compression. Unlike prior approaches that exploit diffusion modeling only indirectly, we reinterpret the compression process itself as a forward diffusion path governed by stochastic differential equations (SDEs). A reverse neural network is trained to reconstruct images by directly reversing the compression process, without requiring Gaussian noise initialization. This approach achieves smooth rate adjustment and photo-realistic reconstructions with only a small number of sampling steps. Extensive experiments on benchmark datasets demonstrate that our method outperforms existing GIC approaches across a range of metrics, including perceptual distortion, statistical fidelity, and no-reference quality assessments.
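To make the abstract's central idea concrete, the following is a minimal sketch (not the authors' code) of a reverse sampler that starts from a decoded compressed feature rather than Gaussian noise, loosely illustrating the idea of treating compression as a forward diffusion and reversing it with a learned gradient (score) network. All names (ScoreNet, euler_reverse, n_steps, sigma) are hypothetical placeholders, and the drift/noise schedule is assumed for illustration only:

# Minimal illustrative sketch, not the paper's implementation.
import torch
import torch.nn as nn


class ScoreNet(nn.Module):
    """Hypothetical network estimating the gradient of the rate-variable
    feature distribution at a given 'rate time' t."""

    def __init__(self, channels: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels + 1, channels, 3, padding=1),
            nn.SiLU(),
            nn.Conv2d(channels, channels, 3, padding=1),
        )

    def forward(self, y: torch.Tensor, t: torch.Tensor) -> torch.Tensor:
        # Broadcast the scalar time t as an extra conditioning channel.
        t_map = t.view(-1, 1, 1, 1).expand(y.size(0), 1, y.size(2), y.size(3))
        return self.net(torch.cat([y, t_map], dim=1))


@torch.no_grad()
def euler_reverse(score_net: ScoreNet, y_compressed: torch.Tensor,
                  n_steps: int = 4, sigma: float = 0.1) -> torch.Tensor:
    """Integrate a simple reverse-time SDE from t=1 (compressed feature)
    back to t=0 with a few Euler-Maruyama steps."""
    y = y_compressed.clone()  # start from the compressed feature, not Gaussian noise
    dt = 1.0 / n_steps
    for i in range(n_steps, 0, -1):
        t = torch.full((y.size(0),), i * dt, device=y.device)
        score = score_net(y, t)               # estimated gradient of the feature log-density
        y = y + sigma ** 2 * score * dt       # deterministic drift step
        if i > 1:                             # optional stochasticity, skipped on the last step
            y = y + sigma * (dt ** 0.5) * torch.randn_like(y)
    return y


if __name__ == "__main__":
    feats = torch.randn(2, 64, 16, 16)        # stand-in for decoded compressed features
    restored = euler_reverse(ScoreNet(), feats)
    print(restored.shape)                     # torch.Size([2, 64, 16, 16])

Because the sampler is initialized at the compressed feature rather than pure noise, only a few reverse steps are needed, which matches the abstract's claim of photo-realistic reconstruction with a minimal number of sampling steps.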

@article{han2025_2505.20984,
  title={Generative Image Compression by Estimating Gradients of the Rate-variable Feature Distribution},
  author={Minghao Han and Weiyi You and Jinhua Zhang and Leheng Zhang and Ce Zhu and Shuhang Gu},
  journal={arXiv preprint arXiv:2505.20984},
  year={2025}
}