Mixing times of data-augmentation Gibbs samplers for high-dimensional probit regression

We investigate the convergence properties of popular data-augmentation samplers for Bayesian probit regression. Leveraging recent results on Gibbs samplers for log-concave targets, we provide simple and explicit non-asymptotic bounds on the associated mixing times (in Kullback-Leibler divergence). The bounds depend explicitly on the design matrix and the prior precision, and they hold uniformly over the vector of responses. We specialize the results to different regimes of statistical interest, in which both the number of data points and the number of parameters are large: in particular, we identify scenarios where the mixing times remain bounded as these quantities grow, and ones where they do not. The results are shown to be tight (in the worst case with respect to the responses) and provide guidance on choices of prior distributions that provably lead to fast mixing. An empirical analysis based on coupling techniques suggests that the bounds are effective in predicting practically observed behaviours.
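For concreteness, the data-augmentation samplers studied here follow the classical Albert–Chib scheme: a latent truncated-Gaussian variable is drawn for each observation, then the regression coefficients are drawn from their Gaussian full conditional. The sketch below is illustrative only (the function name, the Gaussian prior with precision `tau`, and all parameter choices are assumptions, not taken from the paper):

```python
import numpy as np
from scipy.stats import truncnorm

def albert_chib_gibbs(X, y, tau=1.0, n_iter=500, rng=None):
    """Data-augmentation Gibbs sampler for Bayesian probit regression
    (Albert-Chib scheme), with a N(0, tau^{-1} I_p) prior on beta.
    Returns the chain of beta draws, shape (n_iter, p)."""
    rng = np.random.default_rng(rng)
    n, p = X.shape
    # Full-conditional covariance of beta: V = (X'X + tau I)^{-1},
    # which depends on the design matrix and the prior precision.
    V = np.linalg.inv(X.T @ X + tau * np.eye(p))
    L = np.linalg.cholesky(V)
    beta = np.zeros(p)
    draws = np.empty((n_iter, p))
    for t in range(n_iter):
        mu = X @ beta
        # z_i | beta ~ N(mu_i, 1) truncated to (0, inf) if y_i = 1,
        # and to (-inf, 0) if y_i = 0 (bounds standardized around mu).
        lo = np.where(y == 1, -mu, -np.inf)
        hi = np.where(y == 1, np.inf, -mu)
        z = mu + truncnorm.rvs(lo, hi, random_state=rng)
        # beta | z ~ N(V X'z, V), sampled via the Cholesky factor of V.
        beta = V @ (X.T @ z) + L @ rng.standard_normal(p)
        draws[t] = beta
    return draws
```

The mixing-time bounds in the paper are expressed in terms of the same quantities that enter this update, namely the design matrix `X` and the prior precision `tau`.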
@article{ascolani2025_2505.14343,
  title   = {Mixing times of data-augmentation Gibbs samplers for high-dimensional probit regression},
  author  = {Filippo Ascolani and Giacomo Zanella},
  journal = {arXiv preprint arXiv:2505.14343},
  year    = {2025}
}