arXiv:2011.08797
Linear Separation via Optimism

17 November 2020
Rafael Hanashiro
Jacob D. Abernethy
Abstract

Binary linear classification has been explored since the very early days of the machine learning literature. Perhaps the most classical algorithm is the Perceptron, where a weight vector used to classify examples is maintained, and additive updates are made as incorrect examples are discovered. The Perceptron has been thoroughly studied and several versions have been proposed over many decades. The key theoretical fact about the Perceptron is that, so long as a perfect linear classifier exists with some margin γ > 0, the number of required updates to find such a perfect linear separator is bounded by 1/γ². What has never been fully addressed is: does there exist an algorithm that can achieve this with fewer updates? In this paper we answer this in the affirmative: we propose the Optimistic Perceptron algorithm, a simple procedure that finds a separating hyperplane in no more than 1/γ updates. We also show experimentally that this procedure can significantly outperform Perceptron.
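For context, the classical Perceptron baseline described in the abstract — maintain a weight vector, make an additive update on each misclassified example — can be sketched as follows. This is a minimal illustration of the standard algorithm only, not the paper's Optimistic Perceptron; the function name and toy data are illustrative.

```python
import numpy as np

def perceptron(X, y, max_passes=1000):
    """Classic Perceptron: additive updates on mistakes.

    X: (n, d) array of examples; y: labels in {-1, +1}.
    Returns (w, mistakes). With unit-norm examples separable at
    margin gamma, the mistake bound is 1 / gamma**2.
    """
    n, d = X.shape
    w = np.zeros(d)
    mistakes = 0
    for _ in range(max_passes):
        clean_pass = True
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:   # misclassified (or on the boundary)
                w = w + yi * xi      # additive update toward the example
                mistakes += 1
                clean_pass = False
        if clean_pass:               # a full pass with no mistakes:
            break                    # w is a perfect separator
    return w, mistakes

# Toy linearly separable data, separated by the first coordinate
X = np.array([[1.0, 1.0], [1.0, -1.0], [-1.0, 1.0], [-1.0, -1.0]])
y = np.array([1, 1, -1, -1])
w, m = perceptron(X, y)
assert (np.sign(X @ w) == y).all()   # w separates the data
```

The paper's contribution is to replace this update scheme with an "optimistic" variant whose update count scales as 1/γ rather than 1/γ².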
