Scaling Probabilistic Circuits via Data Partitioning

11 March 2025
Jonas Seng
Florian Peter Busch
Pooja Prasad
Devendra Singh Dhami
Martin Mundt
Kristian Kersting
Communities: TPM, FedML
Abstract

Probabilistic circuits (PCs) enable us to learn joint distributions over a set of random variables and to perform various probabilistic queries in a tractable fashion. Though the tractability property allows PCs to scale beyond non-tractable models such as Bayesian networks, scaling training and inference of PCs to larger, real-world datasets remains challenging. To remedy the situation, we show how PCs can be learned across multiple machines by recursively partitioning a distributed dataset, thereby unveiling a deep connection between PCs and federated learning (FL). This leads to federated circuits (FCs), a novel and flexible FL framework that (1) allows one to scale PCs in distributed learning environments, (2) trains PCs faster, and (3) unifies, for the first time, horizontal, vertical, and hybrid FL in one framework by re-framing FL as a density estimation problem over distributed datasets. We demonstrate FC's capability to scale PCs on various large-scale datasets, as well as its versatility in handling horizontal, vertical, and hybrid FL within a unified framework on multiple classification tasks.
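To make the core idea concrete, below is a minimal, illustrative Python sketch (not the authors' implementation) of learning a probabilistic circuit by recursively partitioning a dataset: horizontal splits over rows become weighted sum nodes (mixtures over data partitions, as in horizontal FL), vertical splits over columns become product nodes (factorizations over feature subsets, as in vertical FL), and single variables become Gaussian leaves. All class and function names here are hypothetical and chosen for illustration only.

# Minimal sketch of PC learning via recursive data partitioning.
# Horizontal split -> Sum node; vertical split -> Product node; single variable -> Leaf.
import numpy as np

class Leaf:
    """Univariate Gaussian leaf over one column."""
    def __init__(self, data, col):
        self.col = col
        self.mu = data[:, col].mean()
        self.sigma = data[:, col].std() + 1e-6

    def log_prob(self, x):
        z = (x[:, self.col] - self.mu) / self.sigma
        return -0.5 * z**2 - np.log(self.sigma * np.sqrt(2 * np.pi))

class Product:
    """Product node: factorizes over disjoint variable subsets (vertical split)."""
    def __init__(self, children):
        self.children = children

    def log_prob(self, x):
        return sum(c.log_prob(x) for c in self.children)

class Sum:
    """Sum node: mixture over data partitions (horizontal split)."""
    def __init__(self, children, weights):
        self.children = children
        self.log_w = np.log(np.asarray(weights))

    def log_prob(self, x):
        comp = np.stack([lw + c.log_prob(x) for lw, c in zip(self.log_w, self.children)])
        m = comp.max(axis=0)
        return m + np.log(np.exp(comp - m).sum(axis=0))  # log-sum-exp over components

def learn_pc(data, cols, min_rows=50):
    """Recursively partition `data`, restricted to `cols`, into a PC."""
    if len(cols) == 1:
        return Leaf(data, cols[0])
    if data.shape[0] <= min_rows:
        # Fully factorize small partitions.
        return Product([Leaf(data, c) for c in cols])
    # Horizontal split: partition rows (here simply at the median of the first column).
    mask = data[:, cols[0]] <= np.median(data[:, cols[0]])
    parts = [data[mask], data[~mask]]
    weights = [p.shape[0] / data.shape[0] for p in parts]
    # Vertical split inside each partition: halve the variable set.
    half = len(cols) // 2
    children = [
        Product([learn_pc(p, cols[:half], min_rows), learn_pc(p, cols[half:], min_rows)])
        for p in parts
    ]
    return Sum(children, weights)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 4))
    pc = learn_pc(X, cols=list(range(X.shape[1])))
    print("mean log-likelihood:", pc.log_prob(X).mean())

In a federated setting, the same structure applies with the row partitions held by different clients (horizontal FL) and the column subsets held by different parties (vertical FL); the sketch above only illustrates the recursive partitioning scheme on a single machine.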

View on arXiv: https://arxiv.org/abs/2503.08141
@article{seng2025_2503.08141,
  title={Scaling Probabilistic Circuits via Data Partitioning},
  author={Jonas Seng and Florian Peter Busch and Pooja Prasad and Devendra Singh Dhami and Martin Mundt and Kristian Kersting},
  journal={arXiv preprint arXiv:2503.08141},
  year={2025}
}