
Convergence analysis of the block Gibbs sampler for Bayesian probit linear mixed models

6 June 2017
Xin Wang
Vivekananda Roy
arXiv:1706.01846 (abs / PDF / HTML)
Abstract

In this article, we consider Markov chain Monte Carlo (MCMC) algorithms for exploring the intractable posterior densities associated with Bayesian probit linear mixed models under both proper and improper priors on the regression coefficients and variance components. In particular, we construct two-block Gibbs samplers using data augmentation (DA) techniques. Furthermore, we prove geometric ergodicity of the Gibbs samplers, which is the foundation for establishing central limit theorems for MCMC-based estimators and subsequent inferences. Under proper priors, the Gibbs sampler chain is geometrically ergodic in practically all cases, while under improper priors the conditions for geometric convergence are similar to those guaranteeing posterior propriety. We also provide conditions for posterior propriety when the design matrices take commonly observed forms. The Haar parameter expansion for DA (PX-DA) algorithm is an improvement of the DA algorithm that has been shown to be theoretically at least as good as the DA algorithm. We propose corresponding Haar PX-DA algorithms, which have essentially the same computational cost as the two-block Gibbs samplers, and we use an example to demonstrate the efficiency gain of the Haar PX-DA algorithm over the block Gibbs sampler.
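The two-block DA construction described in the abstract builds on the classic Albert–Chib idea for probit models: augment each binary response with a latent Gaussian variable, then alternate between drawing the latent variables given the parameters and the parameters given the latent variables. Below is a minimal illustrative sketch of that general scheme for plain probit regression with a flat (improper) prior on the coefficients; it is not the paper's algorithm, since the mixed-model samplers studied there add a second block for the random effects and variance components, which this sketch omits.

```python
# Minimal two-block DA Gibbs sampler for probit regression (Albert–Chib style).
# Illustrative sketch only: flat prior on beta, no random effects; the paper's
# samplers for probit linear *mixed* models also update random effects and
# variance components, which are omitted here.
import numpy as np
from scipy.stats import truncnorm

def probit_da_gibbs(X, y, n_iter=5000, seed=None):
    rng = np.random.default_rng(seed)
    n, p = X.shape
    XtX_inv = np.linalg.inv(X.T @ X)        # posterior covariance of beta | z
    chol = np.linalg.cholesky(XtX_inv)
    beta = np.zeros(p)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        # Block 1: z_i | beta, y_i ~ N(x_i' beta, 1), truncated to (0, inf)
        # if y_i = 1 and to (-inf, 0) if y_i = 0.
        mu = X @ beta
        lo = np.where(y == 1, -mu, -np.inf)  # truncnorm uses standardized bounds
        hi = np.where(y == 1, np.inf, -mu)
        z = mu + truncnorm.rvs(lo, hi, random_state=rng)
        # Block 2: beta | z ~ N((X'X)^{-1} X'z, (X'X)^{-1}) under the flat prior.
        beta = XtX_inv @ (X.T @ z) + chol @ rng.standard_normal(p)
        draws[t] = beta
    return draws
```

The Haar PX-DA improvement mentioned in the abstract inserts an inexpensive extra move between the two blocks, rescaling the latent vector by a draw from a univariate density derived from the Haar measure, which is why its per-iteration cost is essentially the same as that of the block Gibbs sampler.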
