Structured and Informed Probabilistic Modeling with the Thermodynamic Kolmogorov-Arnold Model

We adapt the Kolmogorov-Arnold Representation Theorem to generative modeling by reinterpreting its inner functions as a Markov kernel between probability spaces via inverse transform sampling. We present a generative model that is interpretable, easy to design, and efficient. Our approach couples a Kolmogorov-Arnold Network generator with independent energy-based priors, trained via maximum likelihood. Inverse sampling enables fast inference, while prior knowledge can be incorporated before training to better align priors with posteriors, thereby improving learning efficiency and sample quality. The learned prior is also recoverable and visualizable post-training, offering an empirical Bayes perspective. To address inflexibility and mitigate prior-posterior mismatch, we introduce scalable extensions based on mixture distributions and Langevin Monte Carlo methods, admitting a trade-off between flexibility and training efficiency. Our contributions connect classical representation theorems with modern probabilistic modeling, while balancing training stability, inference speed, and the quality and diversity of generations.
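To make the inverse-transform-sampling idea concrete, the following is a minimal sketch of drawing samples from a 1D energy-based prior p(z) ∝ exp(-E(z)) by numerically inverting its CDF. The double-well energy function and the grid-based inversion are illustrative assumptions, not the paper's learned prior or implementation.

```python
# Sketch: inverse transform sampling from a 1D energy-based prior
# p(z) ∝ exp(-E(z)). The energy below is a hypothetical double-well
# potential chosen for illustration only.
import numpy as np

def energy(z):
    # Hypothetical energy with minima (density modes) at z = ±1.
    return (z**2 - 1.0) ** 2

# Tabulate the unnormalized density on a grid and build a normalized CDF.
grid = np.linspace(-3.0, 3.0, 10_000)
density = np.exp(-energy(grid))
cdf = np.cumsum(density)
cdf /= cdf[-1]

# Inverse transform: push uniform draws through the (interpolated) inverse CDF.
rng = np.random.default_rng(0)
u = rng.uniform(size=5)
samples = np.interp(u, cdf, grid)
print(samples)
```

Because sampling reduces to one uniform draw and one CDF inversion per latent dimension, inference is fast compared to iterative MCMC; the Langevin Monte Carlo extensions mentioned above trade some of this speed for a more flexible prior.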
@article{raj2025_2506.14167,
  title={Structured and Informed Probabilistic Modeling with the Thermodynamic Kolmogorov-Arnold Model},
  author={Prithvi Raj},
  journal={arXiv preprint arXiv:2506.14167},
  year={2025}
}