Stochastic Conditional Gradient++
In this paper, we develop Stochastic Continuous Greedy++ (SCG++), the first efficient variant of a conditional gradient method for maximizing a continuous submodular function subject to a convex constraint. Concretely, for a monotone and continuous DR-submodular function, SCG++ achieves a tight [(1-1/e)-ε]-approximate solution while using O(1/ε²) stochastic gradient computations and O(1/ε) calls to the linear optimization oracle. The best previously known algorithms either achieve a suboptimal (1/2)-approximate solution with O(1/ε²) stochastic gradients or the tight [(1-1/e)-ε]-approximate solution with a suboptimal O(1/ε³) stochastic gradients. SCG++ enjoys optimality in terms of both the approximation guarantee and the number of stochastic oracle queries. Our novel variance reduction method naturally extends to stochastic convex minimization. More precisely, we develop Stochastic Frank-Wolfe++ (SFW++), which achieves an ε-approximate optimum with only O(1/ε) calls to the linear optimization oracle while using O(1/ε²) stochastic oracle queries in total. Therefore, SFW++ is the first efficient projection-free algorithm that achieves the optimal complexity in terms of stochastic oracle queries.
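To make the two oracles concrete, the following is a minimal sketch of a stochastic Frank-Wolfe loop on a toy convex problem over the probability simplex, where the linear optimization oracle (LMO) reduces to picking a vertex. The momentum-averaged gradient estimate here is a simple stand-in, not the paper's Hessian-based variance-reduction step; the problem instance, step-size schedule, and batch size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy problem: minimize f(x) = E_i[ 0.5 * (a_i . x - b_i)^2 ] over the
# probability simplex. The simplex admits a cheap LMO:
# argmin_{s in simplex} <g, s> is the vertex e_j with j = argmin_j g_j.
d, n = 5, 200
A = rng.normal(size=(n, d))
x_star = rng.random(d)
x_star /= x_star.sum()          # ground-truth point inside the simplex
b = A @ x_star

def stoch_grad(x, batch=4):
    """Unbiased stochastic gradient from a random mini-batch (one
    'stochastic oracle query' per sampled index)."""
    idx = rng.integers(0, n, size=batch)
    return A[idx].T @ (A[idx] @ x - b[idx]) / batch

def lmo_simplex(g):
    """Linear optimization oracle over the probability simplex."""
    s = np.zeros_like(g)
    s[np.argmin(g)] = 1.0
    return s

x = np.full(d, 1.0 / d)          # start at the simplex barycenter
g_bar = np.zeros(d)              # running gradient estimate
T = 500
for t in range(1, T + 1):
    rho = 2.0 / (t + 2)          # averaging weight (assumed schedule)
    g_bar = (1 - rho) * g_bar + rho * stoch_grad(x)
    s = lmo_simplex(g_bar)       # one LMO call per iteration
    gamma = 2.0 / (t + 2)        # standard Frank-Wolfe step size
    x = (1 - gamma) * x + gamma * s  # convex combination: stays feasible

print(0.5 * float(np.mean((A @ x - b) ** 2)))
```

Because every iterate is a convex combination of simplex vertices, the method never needs a projection step, which is the defining appeal of conditional gradient methods under constraints where projections are expensive but linear optimization is cheap.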