Why do Larger Models Generalize Better? A Theoretical Perspective via the XOR Problem

6 October 2018
Alon Brutzkus, Amir Globerson
MLT

Papers citing "Why do Larger Models Generalize Better? A Theoretical Perspective via the XOR Problem"

2 / 2 papers shown
On the Power and Limitations of Random Features for Understanding Neural Networks
Gilad Yehudai, Ohad Shamir
MLT · 26 · 181 · 0 · 01 Apr 2019

Gradient Descent with Early Stopping is Provably Robust to Label Noise for Overparameterized Neural Networks
Mingchen Li, Mahdi Soltanolkotabi, Samet Oymak
NoLa · 47 · 351 · 0 · 27 Mar 2019