ResearchTrend.AI

Multi-Platform Autobidding with and without Predictions

26 February 2025
Gagan Aggarwal
Anupam Gupta
Xizhi Tan
Mingfei Zhao
Abstract

We study the problem of finding the optimal bidding strategy for an advertiser in a multi-platform auction setting. The competition on a platform is captured by a value and a cost function, mapping bidding strategies to value and cost respectively. We assume a diminishing returns property, whereby the marginal cost is increasing in value. The advertiser uses an autobidder that selects a bidding strategy for each platform, aiming to maximize total value subject to budget and return-on-spend constraints. The advertiser has no prior information and learns about the value and cost functions by querying a platform with a specific bidding strategy. Our goal is to design algorithms that find the optimal bidding strategy with a small number of queries. We first present an algorithm that requires \(O(m \log (mn) \log n)\) queries, where \(m\) is the number of platforms and \(n\) is the number of possible bidding strategies on each platform. Moreover, we adopt the learning-augmented framework and propose an algorithm that utilizes a (possibly erroneous) prediction of the optimal bidding strategy. We provide an \(O(m \log (m\eta) \log \eta)\) query-complexity bound on our algorithm as a function of the prediction error \(\eta\). This guarantee gracefully degrades to \(O(m \log (mn) \log n)\), achieving a "best-of-both-worlds" scenario: \(O(m)\) queries when given a correct prediction, and \(O(m \log (mn) \log n)\) queries even for an arbitrarily incorrect prediction.
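The "best-of-both-worlds" query pattern can be illustrated with a one-dimensional analogue. The sketch below is not the paper's multi-platform algorithm; it assumes a monotone feasibility oracle over \(n\) candidate strategies (the names `query`, `guess`, and `search_with_prediction` are ours) and performs doubling search outward from a predicted threshold, so a correct prediction costs \(O(1)\) queries while an arbitrarily wrong one still costs only \(O(\log n)\).

```python
def search_with_prediction(query, n, guess):
    """Find the largest t in [0, n) with query(t) == True, where query is
    monotone (True up to some threshold, then False forever after).

    Doubling search outward from `guess`: a correct guess costs O(1)
    queries; a guess at distance eta from the answer costs O(log eta),
    degrading gracefully to O(log n). Returns -1 if query is never True.
    """
    guess = max(0, min(n - 1, guess))
    step = 1
    if query(guess):
        # Guess is feasible: double upward to bracket the threshold.
        lo, hi = guess, n          # invariant: query(lo) True; hi a sentinel
        while lo + step < n:
            if query(lo + step):
                lo += step
                step *= 2
            else:
                hi = lo + step     # first known-infeasible point
                break
    else:
        # Guess is infeasible: double downward to bracket the threshold.
        lo, hi = -1, guess         # invariant: query(hi) False; lo a sentinel
        while hi - step >= 0:
            if query(hi - step):
                lo = hi - step     # first known-feasible point
                break
            hi -= step
            step *= 2
    # Standard binary search inside the bracket (lo, hi).
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if query(mid):
            lo = mid
        else:
            hi = mid
    return lo
```

The paper's setting runs \(m\) such threshold searches, one per platform, coordinated under the shared budget and return-on-spend constraints, which is where the extra \(\log(mn)\) factor arises.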

@article{aggarwal2025_2502.19317,
  title={Multi-Platform Autobidding with and without Predictions},
  author={Gagan Aggarwal and Anupam Gupta and Xizhi Tan and Mingfei Zhao},
  journal={arXiv preprint arXiv:2502.19317},
  year={2025}
}