Variational inference for large-scale models of discrete choice

Discrete choice models are a type of hierarchical model widely used in applied statistics. When the decision-makers in the hierarchy are not assumed to have identical preferences, exact estimation and inference become intractable. Markov chain Monte Carlo (MCMC) techniques make approximate inference possible, but their computational cost is prohibitive on the large data sets now routinely available. Variational methods provide a deterministic alternative for approximating the posterior distribution. We derive variational procedures for empirical Bayes and fully Bayesian inference in the mixed multinomial logit model of discrete choice. The algorithms require only that we solve a sequence of unconstrained optimization problems, which are shown to be convex. Extensive simulations demonstrate that variational methods achieve accuracy competitive with MCMC at a small fraction of the computational cost.
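To make the model concrete: in the mixed multinomial logit, each decision-maker's taste coefficients are drawn from a population distribution (here taken to be Gaussian), and choice probabilities are expectations of softmax probabilities over that distribution. Below is a minimal Monte Carlo sketch of those choice probabilities, not the paper's variational algorithm; the function name and the Gaussian mixing assumption are illustrative.

```python
import numpy as np

def mixed_mnl_choice_probs(X, mu, Sigma, n_draws=2000, seed=0):
    """Monte Carlo estimate of mixed multinomial logit choice probabilities.

    X     : (J, K) attribute matrix for one choice set
            (J alternatives, K attributes per alternative)
    mu    : (K,) mean of the random coefficients, beta ~ N(mu, Sigma)
    Sigma : (K, K) covariance of the random coefficients

    Returns a length-J vector of choice probabilities, averaged over
    draws of beta from the population (mixing) distribution.
    """
    rng = np.random.default_rng(seed)
    betas = rng.multivariate_normal(mu, Sigma, size=n_draws)  # (n_draws, K)
    utils = betas @ X.T                                       # (n_draws, J) utilities
    utils -= utils.max(axis=1, keepdims=True)                 # softmax stability shift
    expu = np.exp(utils)
    probs = expu / expu.sum(axis=1, keepdims=True)            # per-draw softmax
    return probs.mean(axis=0)                                 # expectation over beta
```

Estimating `mu` and `Sigma` from observed choices is exactly the inference problem the abstract addresses; the variational approach replaces MCMC sampling of the posterior over these parameters with deterministic convex optimization.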