ResearchTrend.AI
Revisiting Perceptron: Efficient and Label-Optimal Learning of Halfspaces

18 February 2017
Songbai Yan
Chicheng Zhang
Abstract

It has been a long-standing problem to efficiently learn a halfspace using as few labels as possible in the presence of noise. In this work, we propose an efficient Perceptron-based algorithm for actively learning homogeneous halfspaces under the uniform distribution over the unit sphere. Under the bounded noise condition~\cite{MN06}, where each label is flipped with probability at most $\eta < \frac{1}{2}$, our algorithm achieves a near-optimal label complexity of $\tilde{O}\left(\frac{d}{(1-2\eta)^2}\ln\frac{1}{\epsilon}\right)$ in time $\tilde{O}\left(\frac{d^2}{\epsilon(1-2\eta)^3}\right)$. Under the adversarial noise condition~\cite{ABL14, KLS09, KKMS08}, where at most a $\tilde{\Omega}(\epsilon)$ fraction of labels can be flipped, our algorithm achieves a near-optimal label complexity of $\tilde{O}\left(d\ln\frac{1}{\epsilon}\right)$ in time $\tilde{O}\left(\frac{d^2}{\epsilon}\right)$. Furthermore, we show that our active learning algorithm can be converted to an efficient passive learning algorithm that has near-optimal sample complexities with respect to $\epsilon$ and $d$.
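The bounded-noise setting above can be illustrated with a toy sketch. This is not the paper's actual algorithm (which uses band-based active sampling with a carefully scheduled query region); it only shows the basic ingredients: examples drawn uniformly from the unit sphere, labels from a hidden homogeneous halfspace flipped with probability at most $\eta$, and a modified Perceptron update that reflects the current guess across any queried point it misclassifies. All parameters (`d`, `eta`, `n_queries`) are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 10           # dimension (illustrative choice)
eta = 0.1        # bounded-noise flip rate, must be < 1/2
n_queries = 2000 # label budget (illustrative choice)

# Hidden halfspace normal w*, and a random unit-norm initial guess w.
w_star = np.zeros(d)
w_star[0] = 1.0
w = rng.standard_normal(d)
w /= np.linalg.norm(w)

for _ in range(n_queries):
    # Draw x uniformly from the unit sphere.
    x = rng.standard_normal(d)
    x /= np.linalg.norm(x)

    # Bounded noise: the true label sign(<w*, x>) is flipped w.p. eta.
    y = np.sign(w_star @ x)
    if rng.random() < eta:
        y = -y

    # Update only on points the current guess misclassifies:
    # the modified Perceptron step reflects w across the hyperplane
    # orthogonal to x, which preserves ||w|| = 1.
    if np.sign(w @ x) != y:
        w = w - 2.0 * (w @ x) * x

# Alignment <w, w*> measures progress (1.0 = perfect recovery).
print(w @ w_star)
```

Because each update is a reflection, the iterate stays on the unit sphere throughout; the paper's analysis shows how restricting queries to a shrinking band around the current hyperplane yields the $\tilde{O}\left(\frac{d}{(1-2\eta)^2}\ln\frac{1}{\epsilon}\right)$ label complexity, which this unfiltered sketch does not attempt to reproduce.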
