Deep generative models as the probability transformation functions

Comments: 9 pages main text, 6 figures, 3 pages bibliography
Abstract

This paper introduces a unified theoretical perspective that views deep generative models as probability transformation functions. Despite the apparent differences in architecture and training methodology among various types of generative models - autoencoders, autoregressive models, generative adversarial networks, normalizing flows, diffusion models, and flow matching - we demonstrate that they all fundamentally operate by transforming simple predefined distributions into complex target data distributions. This unifying perspective facilitates the transfer of methodological improvements between model architectures and provides a foundation for developing universal theoretical approaches, potentially leading to more efficient and effective generative modeling techniques.
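
As a concrete illustration of the mechanism the abstract describes, the sketch below (not code from the paper; the choice of sinh as the invertible map, and all function names, are purely illustrative) pushes a standard normal base distribution through an invertible transform and evaluates the density of the result with the change-of-variables formula, the construction that normalizing flows implement exactly and that the abstract presents as common, in various forms, to all the listed model families.

```python
import numpy as np

# Minimal sketch of the "probability transformation" view: a simple base
# distribution (standard normal) is pushed through an invertible map f,
# yielding a more complex distribution whose density follows from the
# change-of-variables formula:
#   p_x(x) = p_z(f^{-1}(x)) * |d f^{-1}(x) / dx|

rng = np.random.default_rng(0)

def f(z):
    # Invertible, smooth map; sinh is an arbitrary illustrative choice.
    return np.sinh(z)

def f_inv(x):
    return np.arcsinh(x)

def log_prob_x(x):
    # Log-density of x = f(z) with z ~ N(0, 1), via change of variables.
    z = f_inv(x)
    log_pz = -0.5 * (z**2 + np.log(2.0 * np.pi))  # standard normal log-density
    log_det = -0.5 * np.log1p(x**2)               # log |d arcsinh(x)/dx|
    return log_pz + log_det

# Sampling is simply: draw from the base distribution, then transform.
samples = f(rng.standard_normal(5))
print("samples:", samples)

# Sanity check: the transformed density should integrate to roughly 1.
xs = np.linspace(-10.0, 10.0, 4001)
dx = xs[1] - xs[0]
print("integral:", np.exp(log_prob_x(xs)).sum() * dx)
```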

@article{bondar2025_2506.17171,
  title={Deep generative models as the probability transformation functions},
  author={Vitalii Bondar and Vira Babenko and Roman Trembovetskyi and Yurii Korobeinyk and Viktoriya Dzyuba},
  journal={arXiv preprint arXiv:2506.17171},
  year={2025}
}