ResearchTrend.AI
  3. 2009.01947
Practical and Parallelizable Algorithms for Non-Monotone Submodular Maximization with Size Constraint

3 September 2020
Yixing Chen
Alan Kuhnle
Abstract

We present combinatorial and parallelizable algorithms for maximization of a submodular function, not necessarily monotone, with respect to a size constraint. We improve the best approximation factor achieved by an algorithm that has optimal adaptivity and nearly optimal query complexity to $0.193 - \varepsilon$. The conference version of this work mistakenly employed a subroutine that does not work for non-monotone submodular functions. In this version, we propose a fixed and improved subroutine, ThreshSeq, to add a set with high average marginal gain; it returns a solution in $O(\log(n))$ adaptive rounds with high probability. Moreover, we provide two approximation algorithms. The first has approximation ratio $1/6 - \varepsilon$, adaptivity $O(\log(n))$, and query complexity $O(n \log(k))$, while the second has approximation ratio $0.193 - \varepsilon$, adaptivity $O(\log^2(n))$, and query complexity $O(n \log(k))$. Our algorithms are empirically validated to use a low number of adaptive rounds and total queries while obtaining solutions with high objective value in comparison with state-of-the-art approximation algorithms, including continuous algorithms that use the multilinear extension.
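To give a feel for the threshold-based selection the abstract describes, below is a minimal sequential sketch on a toy coverage objective. It is an assumption-laden illustration of the general idea only: add any element whose marginal gain meets the current threshold, then lower the threshold geometrically. It is not the paper's ThreshSeq, which additionally uses randomization to handle non-monotonicity and achieves $O(\log(n))$ adaptive rounds by adding many elements in parallel per round; the function `threshold_greedy`, the toy instance, and all parameter choices here are hypothetical.

```python
def coverage(sets_by_elem, chosen):
    """Toy submodular objective: number of items covered by the chosen sets."""
    covered = set()
    for i in chosen:
        covered |= sets_by_elem[i]
    return len(covered)

def threshold_greedy(ground, f, k, eps=0.1):
    """Sequential threshold-greedy sketch (monotone case only):
    add any element whose marginal gain meets the current threshold tau,
    then lower tau by a (1 - eps) factor until gains are negligible."""
    S = []
    # Initial threshold: the best single-element value.
    tau = max(f([e]) for e in ground)
    floor = (eps / len(ground)) * tau  # stop once remaining gains are negligible
    while tau > floor and len(S) < k:
        for e in ground:
            if e in S or len(S) >= k:
                continue
            gain = f(S + [e]) - f(S)  # marginal gain of e w.r.t. current S
            if gain >= tau:
                S.append(e)
        tau *= (1 - eps)
    return S

# Hypothetical toy instance: each candidate element covers a set of items.
sets_by_elem = {
    0: {1, 2, 3},
    1: {3, 4},
    2: {5},
    3: {1, 2, 3, 4, 5},
    4: {6},
}
f = lambda S: coverage(sets_by_elem, S)
sol = threshold_greedy(list(sets_by_elem), f, k=2)
print(sol, f(sol))  # → [3, 4] 6
```

A sweep over geometrically decreasing thresholds is what makes the query complexity scale like $O(n \log(k))$-type bounds rather than the $O(nk)$ of plain greedy; the paper's contribution is doing each threshold step in few adaptive rounds while preserving a guarantee for non-monotone functions.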
