In this article, we consider Markov chain Monte Carlo (MCMC) algorithms for exploring the intractable posterior densities associated with Bayesian probit linear mixed models under both proper and improper priors on the regression coefficients and variance components. In particular, we construct two-block Gibbs samplers using data augmentation (DA) techniques. Furthermore, we prove geometric ergodicity of the Gibbs samplers, which provides the foundation for central limit theorems for MCMC-based estimators and subsequent inferences. Under improper priors, the conditions for geometric convergence are similar to those guaranteeing posterior propriety. We also provide conditions for posterior propriety when the design matrices take commonly observed forms. The Haar parameter expansion for DA (PX-DA) algorithm improves on the DA algorithm and has been shown to be theoretically at least as good. For probit linear mixed models, we propose corresponding Haar PX-DA algorithms, which have essentially the same computational cost as the two-block Gibbs samplers. An example demonstrates the efficiency gains of the Haar PX-DA algorithms over the block Gibbs samplers and full Gibbs samplers.
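For intuition, the two-block DA Gibbs structure described above is easiest to see in the classic Albert–Chib scheme for plain probit regression (no random effects): alternate between drawing latent truncated-normal variables given the coefficients, and drawing the coefficients given the latents. The following is a minimal illustrative sketch, not the paper's mixed-model sampler, assuming a N(0, B) prior on the coefficient vector beta (the function name and arguments are hypothetical):

```python
import numpy as np
from scipy.stats import truncnorm

def probit_da_gibbs(X, y, B_inv, n_iter=2000, seed=0):
    """Two-block DA Gibbs sampler (Albert-Chib style) for Bayesian
    probit regression, y_i = 1(x_i' beta + eps_i > 0), eps_i ~ N(0, 1),
    with a N(0, B) prior on beta (B_inv = B^{-1})."""
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta = np.zeros(p)
    V = np.linalg.inv(X.T @ X + B_inv)   # posterior covariance of beta | z
    L = np.linalg.cholesky(V)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        mu = X @ beta
        # Block 1: z_i | beta, y_i is N(mu_i, 1) truncated to (0, inf)
        # if y_i = 1 and to (-inf, 0) if y_i = 0 (bounds standardized).
        a = np.where(y == 1, -mu, -np.inf)
        b = np.where(y == 1, np.inf, -mu)
        z = mu + truncnorm.rvs(a, b, random_state=rng)
        # Block 2: beta | z is N(V X'z, V)
        m = V @ (X.T @ z)
        beta = m + L @ rng.standard_normal(p)
        draws[t] = beta
    return draws
```

The paper's samplers extend this two-block pattern to mixed models (updating fixed effects, random effects, and variance components jointly in blocks); the Haar PX-DA variant adds an inexpensive extra move on the latent variables per iteration.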