ResearchTrend.AI
Deep Double Descent: Where Bigger Models and More Data Hurt

4 December 2019
Preetum Nakkiran, Gal Kaplun, Yamini Bansal, Tristan Yang, Boaz Barak, Ilya Sutskever

Papers citing "Deep Double Descent: Where Bigger Models and More Data Hurt"

5 / 205 papers shown
On the Existence of Simpler Machine Learning Models
Lesia Semenova, Cynthia Rudin, Ronald E. Parr
05 Aug 2019

Regularity Normalization: Neuroscience-Inspired Unsupervised Attention across Neural Network Layers
Baihan Lin
27 Feb 2019

On a Sparse Shortcut Topology of Artificial Neural Networks
Fenglei Fan, Dayang Wang, Hengtao Guo, Qikui Zhu, Pingkun Yan, Ge Wang, Hengyong Yu
22 Nov 2018

Optimal ridge penalty for real-world high-dimensional data can be zero or negative due to the implicit ridge regularization
D. Kobak, Jonathan Lomond, Benoit Sanchez
28 May 2018

High-dimensional dynamics of generalization error in neural networks
Madhu S. Advani, Andrew M. Saxe (AI4CE)
10 Oct 2017