ResearchTrend.AI
Measuring Semantic Information Production in Generative Diffusion Models

12 June 2025
Florian Handke
Félix Koulischer
Gabriel Raya
Luca Ambrogioni
Community: DiffM
Main: 4 pages · 13 figures · Bibliography: 1 page · Appendix: 8 pages
Abstract

It is well known that semantic and structural features of generated images emerge at different times during the reverse dynamics of diffusion, a phenomenon that has been connected to physical phase transitions in magnets and other materials. In this paper, we introduce a general information-theoretic approach to measuring when these class-semantic "decisions" are made during the generative process. Using an online formula for the optimal Bayesian classifier, we estimate the conditional entropy of the class label given the noisy state. We then use the time derivative of the conditional entropy to identify the time intervals with the highest information transfer between noisy states and class labels. We demonstrate our method on one-dimensional Gaussian mixture models and on DDPM models trained on the CIFAR-10 dataset. As expected, we find that semantic information transfer is highest in the intermediate stages of diffusion and vanishes in the final stages. However, we also find sizable differences between the entropy-rate profiles of different classes, suggesting that different "semantic decisions" are made at different intermediate times.
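The abstract's pipeline (optimal Bayesian classifier → conditional entropy of the class label given the noisy state → time derivative to locate peak information transfer) can be sketched on a toy problem. The following is a minimal illustration, not the paper's implementation: it assumes a two-component 1-D Gaussian mixture, a hand-picked variance-preserving noise schedule, and Monte Carlo estimation; all names and numerical choices (`means`, `sigma0`, `alpha_bar`, sample sizes) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative 1-D Gaussian mixture: two equally weighted classes,
# well-separated means (the class label is the "semantic" variable).
means = np.array([-2.0, 2.0])
sigma0 = 0.2                      # per-class std at t = 0
priors = np.array([0.5, 0.5])

# Hand-picked schedule: alpha_bar runs from ~1 (clean data) to ~0 (pure noise).
n_steps = 200
alpha_bar = np.linspace(0.999, 0.001, n_steps)

def conditional_entropy(ab, n_samples=20_000):
    """Monte Carlo estimate of H(Y | X_t) in nats at a given alpha_bar."""
    # Forward process: x_t = sqrt(ab) * x0 + sqrt(1 - ab) * eps.
    y = rng.choice(2, size=n_samples, p=priors)
    x0 = rng.normal(means[y], sigma0)
    xt = np.sqrt(ab) * x0 + np.sqrt(1.0 - ab) * rng.normal(size=n_samples)
    # For a Gaussian mixture, x_t | y stays Gaussian, so the optimal
    # Bayesian classifier p(y | x_t) is available in closed form.
    mu_t = np.sqrt(ab) * means                 # shape (2,)
    var_t = ab * sigma0**2 + (1.0 - ab)
    log_lik = -0.5 * (xt[:, None] - mu_t) ** 2 / var_t - 0.5 * np.log(var_t)
    log_post = log_lik + np.log(priors)
    log_post -= np.logaddexp.reduce(log_post, axis=1, keepdims=True)
    post = np.exp(log_post)
    # Conditional entropy = average entropy of the class posterior.
    return float(np.mean(-np.sum(post * log_post, axis=1)))

H = np.array([conditional_entropy(ab) for ab in alpha_bar])
# Entropy rate along the noising direction: information transfer between
# noisy states and class labels peaks where |dH/dstep| is largest.
rate = np.gradient(H)
peak_step = int(np.argmax(rate))
print(f"H: {H[0]:.3f} -> {H[-1]:.3f} nats; peak transfer near step {peak_step}")
```

On this toy mixture the conditional entropy climbs from near 0 (the label is fully determined by a clean sample) toward ln 2 ≈ 0.693 nats (one binary label's worth of information is lost at pure noise), and the entropy rate peaks at an intermediate step, mirroring the paper's qualitative finding.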

@article{handke2025_2506.10433,
  title={Measuring Semantic Information Production in Generative Diffusion Models},
  author={Florian Handke and Félix Koulischer and Gabriel Raya and Luca Ambrogioni},
  journal={arXiv preprint arXiv:2506.10433},
  year={2025}
}