
Stochastic Variance-Reduced Hamilton Monte Carlo Methods

Difan Zou
Pan Xu
Quanquan Gu
Abstract

We propose a fast stochastic Hamilton Monte Carlo (HMC) method for sampling from a smooth and strongly log-concave distribution. At the core of our proposed method is a variance reduction technique inspired by recent advances in stochastic optimization. We show that, to achieve $\epsilon$ accuracy in 2-Wasserstein distance, our algorithm achieves $\tilde O(n+\kappa^{2}d^{1/2}/\epsilon+\kappa^{4/3}d^{1/3}n^{2/3}/\epsilon^{2/3})$ gradient complexity (i.e., number of component gradient evaluations), which outperforms the state-of-the-art HMC and stochastic gradient HMC methods in a wide regime. We also extend our algorithm to sampling from smooth and general log-concave distributions, and prove the corresponding gradient complexity as well. Experiments on both synthetic and real data demonstrate the superior performance of our algorithm.
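To illustrate the kind of variance reduction the abstract refers to, here is a minimal sketch (not the authors' reference implementation) of an SVRG-style gradient estimator plugged into a stochastic gradient HMC update. The toy quadratic potential, the function names, and the parameters `eta`, `gamma`, `m`, and `b` are illustrative assumptions, not quantities taken from the paper.

```python
# Sketch: variance-reduced stochastic gradient HMC for a target exp(-sum_i f_i(x)),
# with a toy component potential f_i(x) = 0.5 * ||x - data[i]||^2.
import numpy as np

def component_grad(x, data, idx):
    # Gradients of the selected components f_i at x (shape: batch x d).
    return x - data[idx]

def full_grad(x, data):
    # Full gradient sum_i grad f_i(x).
    return np.sum(x[None, :] - data, axis=0)

def svr_hmc(data, d, n_epochs=50, m=10, b=10, eta=1e-3, gamma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    n = data.shape[0]
    x = np.zeros(d)          # position (the sample)
    v = np.zeros(d)          # velocity / momentum
    for _ in range(n_epochs):
        x_snap = x.copy()                    # snapshot point for variance reduction
        g_snap = full_grad(x_snap, data)     # full gradient at the snapshot
        for _ in range(m):
            idx = rng.integers(0, n, size=b)
            # SVRG-style estimate: unbiased for the full gradient, with variance
            # shrinking as the current iterate stays close to the snapshot.
            g = (n / b) * np.sum(
                component_grad(x, data, idx) - component_grad(x_snap, data, idx),
                axis=0,
            ) + g_snap
            # One discretized underdamped-Langevin (HMC-like) step with friction.
            noise = rng.standard_normal(d)
            v = v - eta * (gamma * v + g) + np.sqrt(2 * gamma * eta) * noise
            x = x + eta * v
    return x

# Usage: draw an approximate sample from the toy target defined by `data`.
data = np.random.default_rng(1).normal(size=(1000, 5))
print(svr_hmc(data, d=5))
```

The key design point is that each inner iteration touches only a minibatch of component gradients, while the snapshot term keeps the estimator's variance controlled; this is what allows the gradient complexity to depend on $n^{2/3}$ rather than $n$ in the regime highlighted above.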
