Simulation-based Regularized Logistic Regression

We develop simulation-based methods for regularized logistic regression by exploiting normal mixtures in two ways: using z-distributions to represent the logistic likelihood, and using mixtures of stable distributions to implement regularization penalties including the lasso. By carefully choosing the z-distribution parameterization, and choosing how regularization is applied, we obtain subtly different MCMC sampling schemes whose efficiency varies with the data type (binary vs. binomial, say) and the desired estimator (maximum likelihood, maximum a posteriori, posterior mean, etc.). Advantages of this umbrella approach include flexibility, computational efficiency, applicability in p ≫ n settings, uncertainty estimates, sensitivity analysis, variable selection, and the ability to assess the optimal degree of regularization in a fully Bayesian setup.
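One ingredient mentioned above, the lasso penalty as a normal mixture, can be illustrated with a minimal sketch (this is the standard scale-mixture-of-normals representation of the Laplace prior, not the paper's full sampler; the function name and the rate parameter `gamma` are illustrative assumptions):

```python
import numpy as np

def sample_laplace_via_mixture(gamma, size, rng):
    """Draw Laplace(0, 1/gamma) variates via a normal scale mixture:
    tau2 ~ Exponential(rate = gamma^2 / 2), then beta | tau2 ~ N(0, tau2)
    marginalizes to beta ~ Laplace(0, 1/gamma)."""
    # numpy parameterizes the exponential by its mean (scale = 1/rate).
    tau2 = rng.exponential(scale=2.0 / gamma**2, size=size)
    return rng.normal(loc=0.0, scale=np.sqrt(tau2))

rng = np.random.default_rng(0)
gamma = 2.0
draws = sample_laplace_via_mixture(gamma, size=200_000, rng=rng)

# Laplace(0, b) has variance 2*b^2; here b = 1/gamma = 0.5, so the
# sample variance should be close to 0.5.
print(draws.var())
```

Conditioning on the latent scales `tau2` is what turns a non-conjugate lasso penalty into conditionally normal updates inside a Gibbs sampler, which is the kind of augmentation the abstract alludes to.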