Improving Gibbs Sampling Predictions on Unseen Data for Latent Dirichlet Allocation

Abstract

Latent Dirichlet Allocation (LDA) is a model for discovering the underlying structure of a given data set. LDA and its extensions have been used in unsupervised and supervised learning tasks across a variety of data types including textual, image and biological data. Several methods have been presented for approximate inference of LDA parameters, including Variational Bayes (VB), Collapsed Gibbs Sampling (CGS) and Collapsed Variational Bayes (CVB) techniques. This work explores three novel methods for generating LDA predictions on unobserved data, given a model trained by CGS. We present extensive experiments on real-world data sets for both standard unsupervised LDA and Prior LDA, one of the supervised variants of LDA for multi-label data. In both supervised and unsupervised settings, we perform extensive empirical comparison of our prediction methods with the standard predictions generated by CGS and CVB0 (a variant of CVB). The results show a consistent advantage of one of our methods over CGS under all experimental conditions, and over CVB0 under the majority of conditions.
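As context for the abstract above, the training procedure it builds on, Collapsed Gibbs Sampling for LDA, can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation; the hyperparameters (`alpha`, `beta`), iteration count, and corpus format are assumptions.

```python
import numpy as np

def collapsed_gibbs_lda(docs, V, K, alpha=0.1, beta=0.01, n_iters=50, seed=0):
    """Illustrative collapsed Gibbs sampler for LDA (not the paper's code).

    docs: list of documents, each a list of integer word ids in [0, V)
    V: vocabulary size; K: number of topics
    Returns point estimates (theta, phi): document-topic and topic-word
    distributions derived from the final count matrices.
    """
    rng = np.random.default_rng(seed)
    D = len(docs)
    ndk = np.zeros((D, K))  # topic counts per document
    nkw = np.zeros((K, V))  # word counts per topic
    nk = np.zeros(K)        # total tokens assigned to each topic
    z = []                  # topic assignment for every token

    # Random initialization of topic assignments and counts
    for d, doc in enumerate(docs):
        zd = rng.integers(K, size=len(doc))
        z.append(zd)
        for w, k in zip(doc, zd):
            ndk[d, k] += 1
            nkw[k, w] += 1
            nk[k] += 1

    # Gibbs sweeps: resample each token's topic from its full conditional
    for _ in range(n_iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                # Remove the current assignment from the counts
                ndk[d, k] -= 1; nkw[k, w] -= 1; nk[k] -= 1
                # p(z = k | rest) ∝ (ndk + alpha) * (nkw + beta) / (nk + V*beta)
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + V * beta)
                p /= p.sum()
                k = rng.choice(K, p=p)
                z[d][i] = k
                # Add the new assignment back
                ndk[d, k] += 1; nkw[k, w] += 1; nk[k] += 1

    # Smoothed point estimates from the counts
    theta = (ndk + alpha) / (ndk.sum(axis=1, keepdims=True) + K * alpha)
    phi = (nkw + beta) / (nk[:, None] + V * beta)
    return theta, phi
```

Predicting topic proportions for an unseen document would then require a further inference step over that document's tokens with `phi` held fixed, which is the prediction problem the paper's methods address.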
