ResearchTrend.AI

arXiv:2102.05312
Improved Algorithms for Efficient Active Learning Halfspaces with Massart and Tsybakov noise

10 February 2021
Chicheng Zhang
Yinan Li
Abstract

We give a computationally efficient PAC active learning algorithm for $d$-dimensional homogeneous halfspaces that can tolerate Massart noise (Massart and Nédélec, 2006) and Tsybakov noise (Tsybakov, 2004). Specialized to the $\eta$-Massart noise setting, our algorithm achieves an information-theoretically near-optimal label complexity of $\tilde{O}\left(\frac{d}{(1-2\eta)^2}\,\mathrm{polylog}\left(\frac{1}{\epsilon}\right)\right)$ under a wide range of unlabeled data distributions (specifically, the family of "structured distributions" defined in Diakonikolas et al. (2020)). Under the more challenging Tsybakov noise condition, we identify two subfamilies of noise conditions under which our efficient algorithm provides label complexity guarantees strictly lower than those of passive learning algorithms.
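To make the $\eta$-Massart noise setting concrete, here is a minimal sketch of how labeled examples for a homogeneous halfspace might be drawn under that noise model: each clean label $\mathrm{sign}(\langle w^*, x\rangle)$ is flipped independently with probability $\eta(x) \le \eta < 1/2$. The function name, the isotropic Gaussian marginal, and the choice of a uniform flip rate are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def sample_massart_halfspace(n, d, eta, seed=None):
    """Draw n examples in R^d labeled by a random unit-norm halfspace w*,
    with each label flipped under eta-Massart noise.

    Illustrative sketch: a Massart adversary may choose any flip rate
    eta(x) <= eta per point; here we use the uniform worst case eta.
    """
    rng = np.random.default_rng(seed)
    w_star = rng.standard_normal(d)
    w_star /= np.linalg.norm(w_star)      # unit-norm target halfspace
    X = rng.standard_normal((n, d))       # assumed isotropic Gaussian marginal
    clean = np.sign(X @ w_star)
    clean[clean == 0] = 1.0               # break ties on the boundary
    flips = rng.random(n) < eta           # flip each label w.p. eta
    y = np.where(flips, -clean, clean)
    return X, y, w_star
```

With `eta = 0.2`, roughly 80% of the returned labels agree with the noiseless halfspace labels, which is the regime where the $\frac{d}{(1-2\eta)^2}$ factor in the label complexity stays moderate.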
