
Rao-Blackwellised Reparameterisation Gradients

Main: 14 pages · Bibliography: 1 page · Appendix: 10 pages · 10 figures · 5 tables
Abstract

Latent Gaussian variables are widely used in probabilistic machine learning, and gradient estimators are the machinery that enables gradient-based optimisation for models containing them. The reparameterisation trick is often the default estimator, as it is simple to implement and yields low-variance gradients for variational inference. In this work, we propose the R2-G2 estimator as the Rao-Blackwellisation of the reparameterisation gradient estimator. Interestingly, we show that the local reparameterisation gradient estimator for Bayesian MLPs is an instance of the R2-G2 estimator and of Rao-Blackwellisation. This lets us extend the benefits of Rao-Blackwellised gradients to a suite of probabilistic models. We show that initial training with R2-G2 consistently yields better performance in models with multiple applications of the reparameterisation trick.
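To make the baseline concrete, here is a minimal sketch of the standard (non-Rao-Blackwellised) reparameterisation gradient estimator the abstract refers to. This is not the paper's R2-G2 estimator; it is a generic illustration, with the function name `reparam_grad` and the quadratic test objective chosen for this example.

```python
import numpy as np

def reparam_grad(mu, sigma, f_grad, n_samples=100_000, seed=0):
    """Monte Carlo reparameterisation-trick estimate of the gradient of
    E_{z ~ N(mu, sigma^2)}[f(z)] with respect to (mu, sigma).

    The trick writes z = mu + sigma * eps with eps ~ N(0, 1), so the
    pathwise gradients are dE[f]/dmu = E[f'(z)] and
    dE[f]/dsigma = E[f'(z) * eps].
    """
    rng = np.random.default_rng(seed)
    eps = rng.standard_normal(n_samples)
    z = mu + sigma * eps
    g = f_grad(z)                      # pathwise derivative f'(z) at each sample
    return g.mean(), (g * eps).mean()  # (grad wrt mu, grad wrt sigma)

# Sanity check with f(z) = z^2: E[f] = mu^2 + sigma^2, so the exact
# gradients are (2*mu, 2*sigma).
g_mu, g_sigma = reparam_grad(1.0, 0.5, lambda z: 2.0 * z)
```

Rao-Blackwellisation, as used by R2-G2, replaces such a sampled estimate with a conditional expectation of it, which can never increase its variance.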

@article{lam2025_2506.07687,
  title={Rao-Blackwellised Reparameterisation Gradients},
  author={Kevin Lam and Thang Bui and George Deligiannidis and Yee Whye Teh},
  journal={arXiv preprint arXiv:2506.07687},
  year={2025}
}