ResearchTrend.AI
Near-Polynomially Competitive Active Logistic Regression

7 March 2025
Yihan Zhou
Eric Price
Trung Nguyen
Abstract

We address the problem of active logistic regression in the realizable setting. It is well known that active learning can require exponentially fewer label queries than passive learning, in some cases using log(1/ε) rather than poly(1/ε) labels to achieve error ε larger than the optimum. We present the first algorithm that is polynomially competitive with the optimal algorithm on every input instance, up to factors polylogarithmic in the error and domain size. In particular, if any algorithm achieves label complexity polylogarithmic in ε, so does ours. Our algorithm is based on efficient sampling and can be extended to learn a more general class of functions. We further support our theoretical results with experiments demonstrating performance gains for logistic regression compared to existing active learning algorithms.
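The exponential gap the abstract refers to can be illustrated with the classic one-dimensional threshold example (this is a standard textbook illustration, not the paper's algorithm): a passive learner needs roughly 1/ε random labels to locate a threshold to accuracy ε, while an active learner can binary-search the interval, halving its uncertainty with every query, so about log₂(1/ε) labels suffice. A minimal sketch, assuming noiseless (realizable) labels:

```python
import math

def learn_threshold_active(label, eps):
    """Binary-search for an unknown threshold t in [0, 1].

    `label(x)` returns the noiseless label 1[x >= t]. Each query halves
    the interval known to contain t, so ~log2(1/eps) labels give an
    estimate within eps of the true threshold.
    """
    lo, hi, queries = 0.0, 1.0, 0
    while hi - lo > eps:
        mid = (lo + hi) / 2
        queries += 1
        if label(mid):      # mid is at or above the threshold
            hi = mid
        else:               # mid is below the threshold
            lo = mid
    return (lo + hi) / 2, queries

# Hypothetical example threshold for illustration.
true_t = 0.3781
est, q = learn_threshold_active(lambda x: x >= true_t, 1e-4)
print(est, q)  # q is ~log2(1/eps) = 14, far below the ~1/eps = 10000
               # labels a passive learner would need
```

The paper's contribution goes well beyond this toy case: its algorithm is competitive, up to polylogarithmic factors, with the optimal label complexity on *every* logistic-regression instance, not just instances where binary-search-style structure is obvious.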

@article{zhou2025_2503.05981,
  title={Near-Polynomially Competitive Active Logistic Regression},
  author={Yihan Zhou and Eric Price and Trung Nguyen},
  journal={arXiv preprint arXiv:2503.05981},
  year={2025}
}