Convergence analysis of block Gibbs samplers for Bayesian probit linear mixed models

In this article, we consider Markov chain Monte Carlo (MCMC) algorithms for exploring the intractable posterior densities associated with Bayesian probit linear mixed models under both proper and improper priors on the regression coefficients and variance components. In particular, we construct two-block Gibbs samplers using data augmentation (DA) techniques. Furthermore, we prove geometric ergodicity of these Gibbs samplers, which is the theoretical foundation for central limit theorems for MCMC-based estimators and the inferences built on them. Under proper priors, the Gibbs sampler chain is geometrically ergodic in essentially all practical settings, whereas under improper priors the conditions guaranteeing geometric convergence are similar to those guaranteeing posterior propriety. We also provide conditions for posterior propriety when the design matrices take commonly observed forms. The Haar parameter-expanded DA (PX-DA) algorithm is an improvement over the DA algorithm and has been shown to be theoretically at least as good as the DA algorithm. We propose corresponding Haar PX-DA algorithms, which have essentially the same computational cost as the two-block Gibbs samplers. An example illustrates the efficiency gain of the Haar PX-DA algorithm over the block Gibbs sampler.
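To make the construction concrete, here is a minimal sketch of a two-block DA Gibbs sampler of the kind described above, for an illustrative probit linear mixed model y_i = 1{x_i'β + z_i'u + e_i > 0} with e_i ~ N(0, 1), a proper prior β ~ N(0, beta_prec⁻¹ I_p), u | τ ~ N(0, τ⁻¹ I_q), and τ ~ Gamma(a, b). The blocking shown, the prior choices, and the names (beta_prec, a, b) are assumptions for illustration, not the paper's exact specification: one block draws the latent truncated normals w together with the precision τ (which are conditionally independent given (β, u)), and the other block draws (β, u) jointly from a multivariate normal.

```python
import numpy as np
from scipy.stats import truncnorm


def block_gibbs_probit_mixed(y, X, Z, n_iter=5000, a=0.5, b=0.5,
                             beta_prec=0.01, rng=None):
    """Two-block DA Gibbs sampler for an illustrative Bayesian probit
    linear mixed model (hypothetical specification; see lead-in text).

    y : (n,) array of 0/1 responses; X : (n, p) fixed-effects design;
    Z : (n, q) random-effects design.
    """
    rng = np.random.default_rng() if rng is None else rng
    n, p = X.shape
    q = Z.shape[1]
    W = np.hstack([X, Z])                       # combined design [X Z]
    beta, u, tau = np.zeros(p), np.zeros(q), 1.0
    draws = np.empty((n_iter, p + q + 1))
    for t in range(n_iter):
        # Block 1: draw (w, tau) given (beta, u); given (beta, u) the
        # latent vector w and the precision tau are cond. independent.
        m = X @ beta + Z @ u
        lo = np.where(y == 1, -m, -np.inf)      # standardized lower bound
        hi = np.where(y == 1, np.inf, -m)       # standardized upper bound
        w = m + truncnorm.rvs(lo, hi, random_state=rng)
        tau = rng.gamma(a + q / 2.0, 1.0 / (b + u @ u / 2.0))
        # Block 2: draw (beta, u) jointly given (w, tau) from
        # N((W'W + D)^{-1} W'w, (W'W + D)^{-1}) with prior precision D.
        D = np.diag(np.r_[np.full(p, beta_prec), np.full(q, tau)])
        prec = W.T @ W + D
        L = np.linalg.cholesky(prec)            # prec = L L'
        mean = np.linalg.solve(prec, W.T @ w)
        theta = mean + np.linalg.solve(L.T, rng.standard_normal(p + q))
        beta, u = theta[:p], theta[p:]
        draws[t] = np.r_[beta, u, tau]
    return draws
```

Setting beta_prec = 0 corresponds to a flat improper prior on β, in which case the posterior is proper only under conditions such as those studied in the paper. The Haar PX-DA variant inserts one extra low-dimensional rescaling move between the two blocks, which is why its per-iteration cost is essentially the same as the DA sampler's. For intuition only, in the simpler probit regression setting with a flat prior on the coefficients the extra move has a known closed form; the mixed-model version in the paper differs in its details.

```python
def haar_px_step(w, W, rng):
    """Haar PX-DA rescaling move, shown for plain probit regression with
    design W and a flat coefficient prior (illustrative sketch only)."""
    n = len(w)
    H = W @ np.linalg.solve(W.T @ W, W.T)       # hat matrix W(W'W)^{-1}W'
    resid_ss = w @ w - w @ H @ w                # w'(I - H)w
    g2 = rng.gamma(n / 2.0, 2.0 / resid_ss)     # Gamma(n/2, rate=resid_ss/2)
    return np.sqrt(g2) * w                      # rescale latent data: w -> g w
```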