Tight Risk Bounds for Gradient Descent on Separable Data

2 March 2023
Matan Schliserman, Tomer Koren

Papers citing "Tight Risk Bounds for Gradient Descent on Separable Data"

1. Stability vs Implicit Bias of Gradient Methods on Separable Data and Beyond
   Matan Schliserman, Tomer Koren
   27 Feb 2022

2. Stochastic linear optimization never overfits with quadratically-bounded losses on general data
   Matus Telgarsky
   14 Feb 2022

3. Characterizing the implicit bias via a primal-dual analysis
   Ziwei Ji, Matus Telgarsky
   11 Jun 2019

4. Risk and parameter convergence of logistic regression
   Ziwei Ji, Matus Telgarsky
   20 Mar 2018

5. Convergence of Gradient Descent on Separable Data
   Mor Shpigel Nacson, Jason D. Lee, Suriya Gunasekar, Pedro H. P. Savarese, Nathan Srebro, Daniel Soudry
   05 Mar 2018