Many real-world tasks, from associative memory to symbolic reasoning, demand discrete, structured representations that standard continuous latent models struggle to express naturally. We introduce the Gaussian-Multinoulli Restricted Boltzmann Machine (GM-RBM), a generative energy-based model that extends the Gaussian-Bernoulli RBM (GB-RBM) by replacing binary hidden units with q-state Potts variables. This modification enables a combinatorially richer latent space and supports learning over multivalued, interpretable latent concepts. We formally derive GM-RBM's energy function, learning dynamics, and conditional distributions, showing that it preserves tractable inference and training through contrastive divergence. Empirically, we demonstrate that GM-RBMs model complex multimodal distributions more effectively than binary RBMs, outperforming them on tasks involving analogical recall and structured memory. Our results highlight GM-RBMs as a scalable framework for discrete latent inference with enhanced expressiveness and interpretability.
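The abstract states that the energy function and conditional distributions are derived formally but does not reproduce them. The following is a hedged sketch of the form such an extension conventionally takes, obtained by replacing the Bernoulli hidden units of the standard GB-RBM energy with one-hot Potts units; here h_{jk} denotes state k of hidden unit j, and W, b, c, sigma are assumed parameter names, so the paper's exact parameterization may differ:

E(\mathbf{v}, \mathbf{h}) = \sum_i \frac{(v_i - b_i)^2}{2\sigma_i^2} - \sum_{j,k} c_{jk} h_{jk} - \sum_{i,j,k} \frac{v_i}{\sigma_i^2} W_{ijk} h_{jk}, \qquad h_{jk} \in \{0,1\}, \quad \sum_{k=1}^{q} h_{jk} = 1.

Under this form the conditionals factorize as

p(h_{jk} = 1 \mid \mathbf{v}) = \operatorname{softmax}_k\!\Big( c_{jk} + \sum_i \frac{v_i}{\sigma_i^2} W_{ijk} \Big), \qquad p(v_i \mid \mathbf{h}) = \mathcal{N}\!\Big( b_i + \sum_{j,k} W_{ijk} h_{jk},\ \sigma_i^2 \Big),

so block Gibbs sampling remains tractable: hidden updates are independent q-way softmaxes and visible updates are independent Gaussians, which is what makes contrastive divergence training practical.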
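To make the claimed tractability of contrastive-divergence training concrete, below is a minimal, self-contained Python sketch of CD-1 for a model of the assumed form above (fixed visible variance, numpy only). The GMRBM class and all names are illustrative, not the authors' implementation:

import numpy as np

rng = np.random.default_rng(0)

class GMRBM:
    """Sketch of a Gaussian-Multinoulli RBM: n_v Gaussian visible units,
    n_h hidden Potts units with q states each (one-hot encoded)."""

    def __init__(self, n_v, n_h, q, sigma=1.0):
        self.W = 0.01 * rng.standard_normal((n_v, n_h, q))  # coupling tensor
        self.b = np.zeros(n_v)          # visible biases
        self.c = np.zeros((n_h, q))     # per-state hidden biases
        self.sigma2 = sigma ** 2        # fixed visible variance (assumption)

    def hidden_probs(self, v):
        # p(h_jk = 1 | v): softmax over the q states of each hidden unit
        logits = self.c + np.einsum('i,ijk->jk', v / self.sigma2, self.W)
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        p = np.exp(logits)
        return p / p.sum(axis=1, keepdims=True)

    def sample_hidden(self, p):
        # draw a one-hot state for each hidden unit from its categorical
        h = np.zeros_like(p)
        for j, pj in enumerate(p):
            h[j, rng.choice(len(pj), p=pj)] = 1.0
        return h

    def visible_mean(self, h):
        # p(v_i | h) is Gaussian with this mean and variance sigma2
        return self.b + np.einsum('ijk,jk->i', self.W, h)

    def cd1_update(self, v0, lr=1e-3):
        # one step of contrastive divergence (CD-1)
        p0 = self.hidden_probs(v0)
        h0 = self.sample_hidden(p0)
        v1 = self.visible_mean(h0) + np.sqrt(self.sigma2) * rng.standard_normal(len(v0))
        p1 = self.hidden_probs(v1)
        # positive-phase minus negative-phase sufficient statistics
        self.W += lr * (np.einsum('i,jk->ijk', v0 / self.sigma2, p0)
                        - np.einsum('i,jk->ijk', v1 / self.sigma2, p1))
        self.b += lr * (v0 - v1) / self.sigma2
        self.c += lr * (p0 - p1)

# usage: fit to a toy bimodal distribution
model = GMRBM(n_v=4, n_h=3, q=5)
data = np.concatenate([rng.normal(-2, 0.5, (500, 4)), rng.normal(2, 0.5, (500, 4))])
for epoch in range(10):
    for v in rng.permutation(data):
        model.cd1_update(v)

The only structural change relative to a binary-hidden GB-RBM is that the hidden conditional is a softmax over q states rather than a sigmoid, which is why inference cost stays linear in the number of hidden units.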
@article{kapasi2025_2505.11635,
  title={The Gaussian-Multinoulli Restricted Boltzmann Machine: A Potts Model Extension of the GRBM},
  author={Nikhil Kapasi and William Whitehead and Luke Theogarajan},
  journal={arXiv preprint arXiv:2505.11635},
  year={2025}
}