Maximizing Monotone DR-submodular Continuous Functions by Derivative-free Optimization

Abstract

In this paper, we study the problem of monotone (weakly) DR-submodular continuous maximization. While previous methods require gradient information about the objective function, we propose the first derivative-free algorithm, LDGM. We define $\beta$ and $\alpha$ to characterize how close a function is to being continuous DR-submodular and submodular, respectively. Under a convex polytope constraint, we prove that LDGM achieves a $(1-e^{-\beta}-\epsilon)$-approximation guarantee after $O(1/\epsilon)$ iterations, matching the best previous gradient-based algorithm. Moreover, in some special cases, a variant of LDGM achieves a $((\alpha/2)(1-e^{-\alpha})-\epsilon)$-approximation guarantee for (weakly) submodular functions. We also compare LDGM with the gradient-based Frank-Wolfe algorithm under noise, and show that LDGM can be more robust. Empirical results on budget allocation verify the effectiveness of LDGM.
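The abstract states the guarantees without spelling out LDGM's update rule. As a rough illustration of the derivative-free setting it targets, below is a minimal sketch of a Frank-Wolfe-style continuous greedy loop that replaces exact gradients with two-point finite-difference estimates, run on a toy budget-allocation-style coverage objective. Everything here is an assumption for illustration (the finite-difference estimator, the budget polytope $\{x \in [0,1]^n : \sum_i x_i \le b\}$, and the objective are all hypothetical); this is not the paper's LDGM algorithm.

```python
import numpy as np

def estimate_gradient(f, x, h=1e-4):
    # Two-point finite-difference estimate: uses only function
    # evaluations, which is what "derivative-free" means here.
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2 * h)
    return g

def lmo_budget(g, b):
    # Linear maximization oracle over {v in [0,1]^n : sum(v) <= b}:
    # put weight 1 on the (at most b) coordinates with the largest
    # positive gradient estimates.
    v = np.zeros_like(g)
    top = np.argsort(-g)[:b]
    v[top[g[top] > 0]] = 1.0
    return v

def derivative_free_continuous_greedy(f, n, b, T=200):
    # Frank-Wolfe-style continuous greedy: T steps of size 1/T toward
    # the LMO vertex; the final iterate stays inside the polytope.
    x = np.zeros(n)
    for _ in range(T):
        v = lmo_budget(estimate_gradient(f, x), b)
        x += v / T
    return x

# Toy budget-allocation-style objective (hypothetical data):
# probabilistic coverage, monotone and DR-submodular on [0,1]^n.
rng = np.random.default_rng(0)
P = rng.uniform(0.0, 0.3, size=(5, 8))  # channel-to-customer probabilities
f = lambda x: np.sum(1.0 - np.prod(1.0 - P * x[:, None], axis=0))

x_hat = derivative_free_continuous_greedy(f, n=5, b=2)
print("allocation:", np.round(x_hat, 3), "value:", round(f(x_hat), 4))
```

Note the design choice in the sketch: because the budget polytope is down-closed and convex, accumulating $T$ steps of size $1/T$ toward feasible vertices keeps the final point feasible, which is the standard continuous-greedy argument behind $(1-1/e)$-type guarantees.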
