
Submodular + Concave

Abstract

It has been well established that first order optimization methods can converge to the maximal objective value of concave functions and provide constant factor approximation guarantees for (non-convex/non-concave) continuous submodular functions. In this work, we initiate the study of the maximization of functions of the form F(x) = G(x) + C(x) over a solvable convex body P, where G is a smooth DR-submodular function and C is a smooth concave function. This class of functions is a strict extension of both concave and continuous DR-submodular functions, for which no theoretical guarantee was previously known. We provide a suite of Frank-Wolfe style algorithms which, depending on the nature of the objective function (i.e., whether G and C are monotone or not, and non-negative or not) and on the nature of the set P (i.e., whether it is downward closed or not), provide 1 - 1/e, 1/e, or 1/2 approximation guarantees. We then use our algorithms to obtain a framework that smoothly interpolates between choosing a diverse set of elements from a given ground set (corresponding to the mode of a determinantal point process) and choosing a clustered set of elements (corresponding to the maxima of a suitable concave function). Additionally, we apply our algorithms to various functions in the above class (DR-submodular + concave) in both constrained and unconstrained settings, and show that our algorithms consistently outperform natural baselines.
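As a rough illustration only (not the paper's exact algorithm), the sketch below shows a generic Frank-Wolfe / continuous-greedy style loop for maximizing a monotone DR-submodular + concave objective over a down-closed polytope. The objective, its gradient, and the toy constraint are hypothetical placeholders chosen for this example.

```python
import numpy as np
from scipy.optimize import linprog

def frank_wolfe_dr_plus_concave(grad_F, A, b, dim, iters=100):
    """Illustrative Frank-Wolfe style sketch for maximizing
    F(x) = G(x) + C(x) (DR-submodular + concave) over the down-closed
    polytope {0 <= x <= 1 : A @ x <= b}. This is only a sketch under
    simplifying assumptions, not the authors' exact method."""
    x = np.zeros(dim)               # origin is feasible for a down-closed body
    for _ in range(iters):
        g = grad_F(x)               # gradient of the combined objective at x
        # Linear maximization oracle: argmax_{v in P} <g, v>
        res = linprog(-g, A_ub=A, b_ub=b, bounds=[(0, 1)] * dim, method="highs")
        v = res.x
        x = x + (1.0 / iters) * v   # small step toward the oracle solution
    return x

# Hypothetical toy instance: G(x) = sum(log(1 + x)) is DR-submodular,
# C(x) = -0.25 * ||x||^2 is concave; the constraint is a budget sum(x) <= 1.
def grad_F(x):
    return 1.0 / (1.0 + x) - 0.5 * x

A = np.ones((1, 3))
b = np.array([1.0])
print(frank_wolfe_dr_plus_concave(grad_F, A, b, dim=3))
```

The final iterate is a convex combination of oracle solutions, so it stays inside the convex body; the choice of step size, starting point, and oracle would differ across the monotone/non-monotone and down-closed/general cases the abstract distinguishes.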
