
Fast Stochastic Variance Reduced ADMM for Stochastic Composition Optimization

Abstract

We consider the stochastic composition optimization problem proposed in \cite{wang2017stochastic}, which has applications ranging from estimation to statistical and machine learning. We propose the first ADMM-based algorithm, named com-SVR-ADMM, and show that com-SVR-ADMM converges linearly for strongly convex and Lipschitz smooth objectives, and has a convergence rate of $O(\log S/S)$, which improves upon the $O(S^{-4/9})$ rate in \cite{wang2016accelerating} when the objective is convex and Lipschitz smooth. Moreover, com-SVR-ADMM possesses a rate of $O(1/\sqrt{S})$ when the objective is convex but without Lipschitz smoothness. We also conduct experiments and show that it outperforms existing algorithms.
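For concreteness, the problem class can be sketched as follows, assuming the standard two-level formulation of \cite{wang2017stochastic}; the ADMM-style splitting (variables $y$, matrices $A$, $B$, vector $c$, and regularizer $R$) is an illustrative assumption about the setup handled by ADMM-based methods, not a quotation from this abstract.

% Stochastic composition optimization: minimize a composition of two expectations.
\begin{equation*}
  \min_{x \in \mathbb{R}^d} \; (F \circ G)(x)
  \;=\; \mathbb{E}_{v}\!\left[ f_v\!\big( \mathbb{E}_{w}[\, g_w(x) \,] \big) \right].
\end{equation*}
% Hypothetical ADMM splitting with a (possibly nonsmooth) regularizer R(y),
% which a variance-reduced ADMM method such as com-SVR-ADMM would address:
\begin{equation*}
  \min_{x, y} \; \mathbb{E}_{v}\!\left[ f_v\!\big( \mathbb{E}_{w}[\, g_w(x) \,] \big) \right] + R(y)
  \quad \text{s.t.} \quad A x + B y = c .
\end{equation*}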
