ResearchTrend.AI

The Gaussian-Multinoulli Restricted Boltzmann Machine: A Potts Model Extension of the GRBM

16 May 2025
Nikhil Kapasi
William Whitehead
Luke Theogarajan
Community: AI4CE
Main text: 9 pages · 3 figures · 2 tables · Bibliography: 2 pages
Abstract

Many real-world tasks, from associative memory to symbolic reasoning, demand discrete, structured representations that standard continuous latent models struggle to express naturally. We introduce the Gaussian-Multinoulli Restricted Boltzmann Machine (GM-RBM), a generative energy-based model that extends the Gaussian-Bernoulli RBM (GB-RBM) by replacing binary hidden units with q-state Potts variables. This modification enables a combinatorially richer latent space and supports learning over multivalued, interpretable latent concepts. We formally derive GM-RBM's energy function, learning dynamics, and conditional distributions, showing that it preserves tractable inference and training through contrastive divergence. Empirically, we demonstrate that GM-RBMs model complex multimodal distributions more effectively than binary RBMs, outperforming them on tasks involving analogical recall and structured memory. Our results highlight GM-RBMs as a scalable framework for discrete latent inference with enhanced expressiveness and interoperability.
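The abstract names the model's key ingredients — Gaussian visible units, q-state Potts (Multinoulli) hidden units, tractable conditionals, and contrastive-divergence training — without spelling them out. The sketch below is our own minimal illustration of that structure, not the paper's implementation: shapes, unit variances, and the CD-1 update follow the standard Gaussian-softmax RBM construction, and all names (`hidden_probs`, `cd1_step`, etc.) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical GM-RBM sketch (our assumptions, not the paper's code):
# D Gaussian visible units with unit variance, M hidden units,
# each hidden unit a q-state Potts variable encoded one-hot.
D, M, q = 4, 3, 5
W = 0.01 * rng.standard_normal((D, M, q))  # visible-hidden couplings
b = np.zeros(D)                            # visible (Gaussian) biases
c = np.zeros((M, q))                       # hidden (Potts) biases

def hidden_probs(v):
    """p(h_j = k | v): a softmax over the q states of each hidden unit."""
    logits = c + np.einsum('d,dmk->mk', v, W)    # shape (M, q)
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

def sample_hidden(p):
    """Draw a one-hot q-state sample for each hidden unit."""
    h = np.zeros_like(p)
    for j in range(M):
        h[j, rng.choice(q, p=p[j])] = 1.0
    return h

def visible_mean(h):
    """Mean of the Gaussian conditional p(v | h) (unit variance assumed)."""
    return b + np.einsum('dmk,mk->d', W, h)

def cd1_step(v0, lr=0.01):
    """One CD-1 update: positive phase minus one-step-reconstruction phase."""
    global W, b, c
    p0 = hidden_probs(v0)
    h0 = sample_hidden(p0)
    v1 = visible_mean(h0) + rng.standard_normal(D)  # Gibbs step on visibles
    p1 = hidden_probs(v1)
    W += lr * (np.einsum('d,mk->dmk', v0, p0) - np.einsum('d,mk->dmk', v1, p1))
    b += lr * (v0 - v1)
    c += lr * (p0 - p1)
    return v1

v = rng.standard_normal(D)
for _ in range(10):
    v = cd1_step(v)
print(hidden_probs(v).shape)  # (3, 5): one distribution over q states per hidden unit
```

Setting q = 2 recovers a GB-RBM-like model (each hidden unit chooses between two states), which is one way to see the "Potts extension" framing in the title.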

View on arXiv
@article{kapasi2025_2505.11635,
  title={The Gaussian-Multinoulli Restricted Boltzmann Machine: A Potts Model Extension of the GRBM},
  author={Nikhil Kapasi and William Whitehead and Luke Theogarajan},
  journal={arXiv preprint arXiv:2505.11635},
  year={2025}
}