Semi-supervised Nonnegative Matrix Factorization for Document Classification
Jamie Haddock
Lara Kassab
Sixian Li
Alona Kryshchenko
Rachel Grotheer
Elena Sizikova
Chuntian Wang
Thomas Merkh
R. W. M. A. Madushani
Miju Ahn
Deanna Needell
Kathryn Leonard

Abstract
We propose new semi-supervised nonnegative matrix factorization (SSNMF) models for document classification and provide motivation for these models as maximum likelihood estimators. The proposed SSNMF models simultaneously provide both a topic model and a model for classification, thereby offering highly interpretable classification results. We derive training methods using multiplicative updates for each new model, and demonstrate the application of these models to single-label and multi-label document classification, although the models are flexible enough to accommodate other supervised learning tasks such as regression. We illustrate the promise of these models and training methods on document classification datasets (e.g., 20 Newsgroups, Reuters).
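The specific models and update rules appear in the full paper; as a rough illustration of the general SSNMF idea, the sketch below implements the classical Frobenius-norm formulation, minimizing ||X - AS||_F^2 + lambda * ||Y - BS||_F^2 with multiplicative updates, where X is a term-document matrix, Y a label matrix, A a topic dictionary, B a classification matrix, and S shared document representations. The function name, parameters, and formulation here are illustrative assumptions, not the paper's exact models.

```python
import numpy as np

def ssnmf_multiplicative(X, Y, k, lam=1.0, n_iter=200, eps=1e-10, seed=0):
    """Sketch of Frobenius-norm SSNMF: X ~ A @ S (topics), Y ~ B @ S (labels).

    X   : (words, docs) nonnegative term-document matrix
    Y   : (classes, docs) nonnegative label matrix (one-hot for single-label)
    k   : number of topics
    lam : weight on the supervision (classification) term
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    c = Y.shape[0]
    A = rng.random((m, k))   # topic dictionary
    B = rng.random((c, k))   # classification matrix
    S = rng.random((k, n))   # shared document representations

    for _ in range(n_iter):
        # Multiplicative updates for ||X - AS||_F^2 + lam * ||Y - BS||_F^2;
        # eps guards against division by zero.
        A *= (X @ S.T) / (A @ S @ S.T + eps)
        B *= (Y @ S.T) / (B @ S @ S.T + eps)
        S *= (A.T @ X + lam * (B.T @ Y)) / (A.T @ A @ S + lam * (B.T @ B @ S) + eps)
    return A, B, S
```

In this kind of setup, a held-out document is typically classified by fitting a nonnegative representation s of its term vector against the learned dictionary A (e.g., via nonnegative least squares) and predicting the class with the largest entry of B @ s; the paper's own classification procedure may differ.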