
Implicit Bias of Gradient Descent for Wide Two-layer Neural Networks Trained with the Logistic Loss

11 February 2020 · arXiv:2002.04486
Lénaïc Chizat, Francis R. Bach
MLT

Papers citing "Implicit Bias of Gradient Descent for Wide Two-layer Neural Networks Trained with the Logistic Loss"

2 / 252 papers shown
A Precise High-Dimensional Asymptotic Theory for Boosting and Minimum-$\ell_1$-Norm Interpolated Classifiers
Tengyuan Liang, Pragya Sur
05 Feb 2020
Generalization Properties of hyper-RKHS and its Applications
Fanghui Liu, Lei Shi, Xiaolin Huang, Jie-jin Yang, Johan A. K. Suykens
26 Sep 2018