Linear Query Approximation Algorithms for Non-monotone Submodular Maximization under Knapsack Constraint

17 May 2023
Canh V. Pham, Tan D. Tran, Dung T. K. Ha, My T. Thai
arXiv:2305.10292
Abstract

This work introduces, for the first time, two constant-factor approximation algorithms with linear query complexity for non-monotone submodular maximization over a ground set of size $n$ subject to a knapsack constraint: $\mathsf{DLA}$ and $\mathsf{RLA}$. $\mathsf{DLA}$ is a deterministic algorithm that provides an approximation factor of $6+\epsilon$, while $\mathsf{RLA}$ is a randomized algorithm with an approximation factor of $4+\epsilon$. Both run with $O(n \log(1/\epsilon)/\epsilon)$ query complexity. The key idea behind obtaining a constant approximation ratio with linear query complexity is twofold: (1) divide the ground set into two appropriate subsets and find a near-optimal solution over each with linear queries, and (2) combine a threshold greedy with properties of two disjoint sets, or with a random selection process, to improve solution quality. In addition to the theoretical analysis, we evaluate our proposed solutions on three applications: Revenue Maximization, Image Summarization, and Maximum Weighted Cut, showing that our algorithms not only return results comparable to state-of-the-art algorithms but also require significantly fewer queries.
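The paper's own DLA and RLA procedures are not reproduced here, but the threshold-greedy ingredient mentioned in the abstract can be illustrated with a minimal sketch. Everything below is an assumption made for illustration: the function names, the toy weighted-coverage objective, and the geometric threshold schedule are not taken from the paper. The sketch only shows how a density threshold that decays by a factor of (1 - ε) bounds the number of greedy passes, which is the mechanism behind query complexities of the form O(n log(·)/ε).

```python
# Minimal sketch of a density-threshold greedy pass under a knapsack constraint.
# This is NOT the paper's DLA/RLA; the objective, threshold schedule, and names
# below are illustrative assumptions only.

def threshold_greedy_knapsack(ground_set, f, cost, budget, eps=0.1):
    """Add elements whose marginal gain per unit cost clears a geometrically
    decreasing threshold, while never exceeding the knapsack budget."""
    S = set()
    f_S = f(S)
    # Largest single-element density seeds the threshold.
    d_max = max(f({e}) / cost(e) for e in ground_set)
    tau = d_max
    while tau >= eps * d_max / len(ground_set):
        for e in ground_set:
            if e in S or cost(e) + sum(cost(x) for x in S) > budget:
                continue
            gain = f(S | {e}) - f_S            # one oracle query per check
            if gain / cost(e) >= tau:
                S.add(e)
                f_S += gain
        tau *= (1 - eps)                        # geometric threshold decay
    return S, f_S


if __name__ == "__main__":
    # Toy weighted-coverage objective (submodular): total weight of covered items.
    universe_weights = {"a": 3.0, "b": 2.0, "c": 5.0, "d": 1.0}
    covers = {1: {"a", "b"}, 2: {"b", "c"}, 3: {"c", "d"}, 4: {"a", "d"}}

    def f(S):
        covered = set().union(*(covers[e] for e in S)) if S else set()
        return sum(universe_weights[u] for u in covered)

    cost = lambda e: 1.0 + 0.5 * e              # arbitrary element costs
    print(threshold_greedy_knapsack(set(covers), f, cost, budget=4.0, eps=0.2))
```

Each threshold level touches every element once, and the geometric decay keeps the number of levels logarithmic in the ratio between the starting and stopping thresholds; the paper's algorithms reach the stated $O(n \log(1/\epsilon)/\epsilon)$ bound and constant factors through the subset-splitting and combination steps described in the abstract, which this sketch does not attempt.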
