Regularization-wise double descent: Why it occurs and how to eliminate it
Fatih Yilmaz, Reinhard Heckel · 3 June 2022 · arXiv:2206.01378

Papers citing "Regularization-wise double descent: Why it occurs and how to eliminate it" (10 of 10 papers shown)

On Regularization via Early Stopping for Least Squares Regression
Rishi Sonthalia, Jackie Lok, E. Rebrova · 06 Jun 2024

Survival of the Fittest Representation: A Case Study with Modular Addition
Xiaoman Delores Ding, Zifan Carl Guo, Eric J. Michaud, Ziming Liu, Max Tegmark · 27 May 2024

The Quest of Finding the Antidote to Sparse Double Descent
Victor Quétu, Marta Milovanović · 31 Aug 2023

Sparse Double Descent in Vision Transformers: real or phantom threat?
Victor Quétu, Marta Milovanović, Enzo Tartaglione · 26 Jul 2023

Least Squares Regression Can Exhibit Under-Parameterized Double Descent
Xinyue Li, Rishi Sonthalia · 24 May 2023

DSD$^2$: Can We Dodge Sparse Double Descent and Compress the Neural Network Worry-Free?
Victor Quétu, Enzo Tartaglione · 02 Mar 2023

Can we avoid Double Descent in Deep Neural Networks?
Victor Quétu, Enzo Tartaglione · 26 Feb 2023

Asymptotics of the Sketched Pseudoinverse
Daniel LeJeune, Pratik V. Patil, Hamid Javadi, Richard G. Baraniuk, R. Tibshirani · 07 Nov 2022

Omnigrok: Grokking Beyond Algorithmic Data
Ziming Liu, Eric J. Michaud, Max Tegmark · 03 Oct 2022

Double Trouble in Double Descent: Bias and Variance(s) in the Lazy Regime
Stéphane d'Ascoli, Maria Refinetti, Giulio Biroli, Florent Krzakala · 02 Mar 2020