Towards A Deeper Geometric, Analytic and Algorithmic Understanding of Margins

20 June 2014
Aaditya Ramdas
Javier F. Pena
Abstract

Given a matrix $A$, a linear feasibility problem (of which linear classification is a special case) aims to find a solution to a primal problem $w: A^T w > \mathbf{0}$ or a certificate for the dual problem, which is a probability distribution $p: Ap = \mathbf{0}$. Inspired by the continued importance of "large-margin classifiers" in machine learning, this paper studies a condition measure of $A$ called its \textit{margin} that determines the difficulty of both of the above problems. To aid geometric intuition, we first establish new characterizations of the margin in terms of relevant balls, cones and hulls. Our second contribution is analytical: we present generalizations of Gordan's theorem and variants of Hoffman's theorems, both using margins. We end by proving some new results on a classical iterative scheme, the Perceptron, whose convergence rate famously depends on the margin. Our results are relevant for a deeper understanding of margin-based learning and for proving convergence rates of iterative schemes, apart from providing a unifying perspective on this vast topic.
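As a point of reference for the primal problem $w: A^T w > \mathbf{0}$ mentioned in the abstract, the following is a minimal sketch of the classical (normalized) Perceptron for linear feasibility, not the authors' specific algorithms; the greedy most-violated-constraint update and all names are illustrative assumptions. When the margin $\rho$ is positive, the classical analysis bounds the number of updates by $1/\rho^2$.

```python
import numpy as np

def perceptron(A, max_iters=10000):
    """Classical Perceptron for the primal feasibility problem w: A^T w > 0.

    A is d x n, with each column a data point. Columns are normalized so
    that the margin (and hence the convergence bound 1/rho^2) is
    scale-invariant. Returns w with A^T w > 0, or None if the iteration
    budget is exhausted (e.g., when the problem is infeasible).
    """
    A = A / np.linalg.norm(A, axis=0, keepdims=True)
    w = np.zeros(A.shape[0])
    for _ in range(max_iters):
        scores = A.T @ w
        i = np.argmin(scores)          # most violated constraint
        if scores[i] > 0:
            return w                   # all constraints strictly satisfied
        w = w + A[:, i]                # additive update on a violated point
    return None

# Feasible instance with positive margin: every column has first
# coordinate at least 0.5, so w = e1 strictly separates.
rng = np.random.default_rng(0)
A = rng.normal(size=(2, 50))
A[0] = np.abs(A[0]) + 0.5
w = perceptron(A)
assert w is not None and np.all(A.T @ w > 0)
```

Since normalization only rescales each column by a positive factor, a returned $w$ satisfies $A^T w > \mathbf{0}$ for the original matrix as well.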
