Convergence analysis of block Gibbs samplers for Bayesian probit linear mixed models

Xin Wang and Vivekananda Roy
6 June 2017 · arXiv:1706.01846
Abstract

In this article, we consider Markov chain Monte Carlo (MCMC) algorithms for exploring the intractable posterior densities associated with Bayesian probit linear mixed models, under both proper and improper priors on the regression coefficients and variance components. In particular, we construct two-block Gibbs samplers using data augmentation (DA) techniques. Furthermore, we prove geometric ergodicity of these Gibbs samplers, which is the foundation for central limit theorems for MCMC-based estimators and the subsequent inferences. Under improper priors, the conditions for geometric convergence are similar to those guaranteeing posterior propriety. We also provide conditions for posterior propriety when the design matrices take commonly observed forms. In general, the Haar parameter expansion for DA (PX-DA) algorithm improves on the DA algorithm and has been shown to be theoretically at least as good. For probit linear mixed models, we propose corresponding Haar PX-DA algorithms, which have essentially the same computational cost as the two-block Gibbs samplers. An example demonstrates the efficiency gains of the Haar PX-DA algorithms over the block Gibbs samplers and full Gibbs samplers.
