
Rethinking Probabilistic Circuit Parameter Learning

Comments: 3 pages main text, 2 figures, 2 pages bibliography, 10 pages appendix
Abstract

Probabilistic Circuits (PCs) offer a computationally scalable framework for generative modeling, supporting exact and efficient inference for a wide range of probabilistic queries. While recent advances have significantly improved the expressiveness and scalability of PCs, effectively training their parameters remains a challenge. In particular, a widely used optimization method, full-batch Expectation-Maximization (EM), requires processing the entire dataset before performing a single update, making it impractical for large datasets. While empirical extensions to the mini-batch setting have been proposed, it remains unclear what objective these algorithms are optimizing, making it difficult to assess their theoretical soundness. This paper bridges this gap by establishing a novel connection between the general EM objective and the standard full-batch EM algorithm. Building on this, we derive a theoretically grounded generalization to the mini-batch setting and demonstrate its effectiveness through preliminary empirical results.
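
To make the full-batch versus mini-batch distinction concrete, the sketch below implements EM weight updates for the simplest possible PC: a single sum node mixing two fixed Bernoulli leaves. This is an illustrative toy, not the algorithm derived in the paper; the mini-batch variant shown is the common empirical step-size extension the abstract alludes to, and all names (and the step size alpha) are assumptions of this sketch.

import numpy as np

rng = np.random.default_rng(0)

# Toy PC: one sum node mixing two fixed Bernoulli leaves.
# (Real PCs have many layered sum/product nodes; this isolates the update rule.)
leaf_probs = np.array([0.2, 0.9])          # P(x = 1) under each leaf
weights = np.array([0.5, 0.5])             # sum-node edge weights

data = rng.binomial(1, 0.7, size=10_000)   # synthetic binary observations

def leaf_likelihoods(x):
    # Likelihood of each observation under each leaf, shape (n, 2).
    x = x[:, None]
    return leaf_probs**x * (1.0 - leaf_probs)**(1 - x)

def em_target(w, x):
    # E-step + M-step target: normalized expected flows through the sum
    # node, i.e. posterior responsibilities summed over the observations.
    joint = w * leaf_likelihoods(x)                 # shape (n, 2)
    resp = joint / joint.sum(axis=1, keepdims=True)
    flows = resp.sum(axis=0)
    return flows / flows.sum()

# Full-batch EM: every update requires a pass over the entire dataset.
for _ in range(20):
    weights = em_target(weights, data)

# A common empirical mini-batch extension: step toward the batch target
# via a convex combination with step size alpha, instead of replacing.
alpha, mb_weights = 0.05, np.array([0.5, 0.5])
for _ in range(200):
    batch = rng.choice(data, size=256)
    mb_weights = (1 - alpha) * mb_weights + alpha * em_target(mb_weights, batch)

Under this scheme, each full-batch update touches all 10,000 points while each mini-batch update touches only 256; the question of what objective such step-size updates actually optimize is the gap the paper addresses.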

@article{liu2025_2505.19982,
  title={Rethinking Probabilistic Circuit Parameter Learning},
  author={Anji Liu and Guy Van den Broeck},
  journal={arXiv preprint arXiv:2505.19982},
  year={2025}
}