Cited By
Gradient Descent with Early Stopping is Provably Robust to Label Noise for Overparameterized Neural Networks
Mingchen Li, Mahdi Soltanolkotabi, Samet Oymak
arXiv:1903.11680 · 27 March 2019 · NoLa
Papers citing "Gradient Descent with Early Stopping is Provably Robust to Label Noise for Overparameterized Neural Networks" (50 of 176 papers shown)
Contrastive Representations for Label Noise Require Fine-Tuning · Pierre Nodet, V. Lemaire, A. Bondu, Antoine Cornuéjols · 15/1/0 · 20 Aug 2021
Confidence Adaptive Regularization for Deep Learning with Noisy Labels · Yangdi Lu, Yang Bo, Wenbo He · NoLa · 24/10/0 · 18 Aug 2021
Implicit Sparse Regularization: The Impact of Depth and Early Stopping · Jiangyuan Li, Thanh V. Nguyen, C. Hegde, R. K. Wong · 14/29/0 · 12 Aug 2021
Learning with Noisy Labels via Sparse Regularization · Xiong Zhou, Xianming Liu, Chenyang Wang, Deming Zhai, Junjun Jiang, Xiangyang Ji · NoLa · 26/51/0 · 31 Jul 2021
Adaptive Precision Training (AdaPT): A dynamic fixed point quantized training approach for DNNs · Lorenz Kummer, Kevin Sidak, Tabea Reichmann, Wilfried Gansterer · MQ · 24/5/0 · 28 Jul 2021
A Tale Of Two Long Tails · Daniel D'souza, Zach Nussbaum, Chirag Agarwal, Sara Hooker · 29/22/0 · 27 Jul 2021
Stability & Generalisation of Gradient Descent for Shallow Neural Networks without the Neural Tangent Kernel · Dominic Richards, Ilja Kuzborskij · 13/28/0 · 27 Jul 2021
An Instance-Dependent Simulation Framework for Learning with Label Noise · Keren Gu, Xander Masotto, Vandana Bachani, Balaji Lakshminarayanan, Jack Nikodem, Dong Yin · NoLa · 11/24/0 · 23 Jul 2021
Disparity Between Batches as a Signal for Early Stopping · Mahsa Forouzesh, Patrick Thiran · 39/7/0 · 14 Jul 2021
Generalization by design: Shortcuts to Generalization in Deep Learning · P. Táborský, Lars Kai Hansen · OOD, AI4CE · 10/0/0 · 05 Jul 2021
A Theoretical Analysis of Fine-tuning with Linear Teachers · Gal Shachaf, Alon Brutzkus, Amir Globerson · 34/17/0 · 04 Jul 2021
A Theory-Driven Self-Labeling Refinement Method for Contrastive Representation Learning · Pan Zhou, Caiming Xiong, Xiaotong Yuan, S. Hoi · SSL · 19/12/0 · 28 Jun 2021
Towards Understanding Deep Learning from Noisy Labels with Small-Loss Criterion · Xian-Jin Gui, Wei Wang, Zhang-Hao Tian · NoLa · 27/44/0 · 17 Jun 2021
Influential Rank: A New Perspective of Post-training for Robust Model against Noisy Labels · Seulki Park, Hwanjun Song, Daeho Um, D. Jo, Sangdoo Yun, J. Choi · NoLa · 26/0/0 · 14 Jun 2021
Towards the Memorization Effect of Neural Networks in Adversarial Training · Han Xu, Xiaorui Liu, Wentao Wang, Wenbiao Ding, Zhongqin Wu, Zitao Liu, Anil K. Jain, Jiliang Tang · TDI, AAML · 24/6/0 · 09 Jun 2021
Neural Collapse Under MSE Loss: Proximity to and Dynamics on the Central Path · X. Y. Han, V. Papyan, D. Donoho · AAML · 28/136/0 · 03 Jun 2021
Sample Selection with Uncertainty of Losses for Learning with Noisy Labels · Xiaobo Xia, Tongliang Liu, Bo Han, Biwei Huang, Jun Yu, Gang Niu, Masashi Sugiyama · NoLa · 17/110/0 · 01 Jun 2021
Fit without fear: remarkable mathematical phenomena of deep learning through the prism of interpolation · M. Belkin · 14/182/0 · 29 May 2021
Estimating Instance-dependent Bayes-label Transition Matrix using a Deep Neural Network · Shuo Yang, Erkun Yang, Bo Han, Yang Liu, Min Xu, Gang Niu, Tongliang Liu · NoLa, BDL · 29/42/0 · 27 May 2021
Principal Components Bias in Over-parameterized Linear Models, and its Manifestation in Deep Neural Networks · Guy Hacohen, D. Weinshall · 16/10/0 · 12 May 2021
RATT: Leveraging Unlabeled Data to Guarantee Generalization · Saurabh Garg, Sivaraman Balakrishnan, J. Zico Kolter, Zachary Chase Lipton · 28/30/0 · 01 May 2021
Generalization Guarantees for Neural Architecture Search with Train-Validation Split · Samet Oymak, Mingchen Li, Mahdi Soltanolkotabi · AI4CE, OOD · 36/13/0 · 29 Apr 2021
Memorisation versus Generalisation in Pre-trained Language Models · Michael Tänzer, Sebastian Ruder, Marek Rei · 94/50/0 · 16 Apr 2021
The Impact of Activation Sparsity on Overfitting in Convolutional Neural Networks · Karim Huesmann, Luis Garcia Rodriguez, Lars Linsen, Benjamin Risse · 19/3/0 · 13 Apr 2021
Contrast to Divide: Self-Supervised Pre-Training for Learning with Noisy Labels · Evgenii Zheltonozhskii, Chaim Baskin, A. Mendelson, A. Bronstein, Or Litany · SSL · 33/92/0 · 25 Mar 2021
Low Dimensional Landscape Hypothesis is True: DNNs can be Trained in Tiny Subspaces · Tao Li, Lei Tan, Qinghua Tao, Yipeng Liu, Xiaolin Huang · 37/10/0 · 20 Mar 2021
Asymptotics of Ridge Regression in Convolutional Models · Mojtaba Sahraee-Ardakan, Tung Mai, Anup B. Rao, Ryan Rossi, S. Rangan, A. Fletcher · MLT · 16/2/0 · 08 Mar 2021
Convolutional Normalization: Improving Deep Convolutional Network Robustness and Training · Sheng Liu, Xiao Li, Yuexiang Zhai, Chong You, Zhihui Zhu, C. Fernandez-Granda, Qing Qu · 17/25/0 · 01 Mar 2021
Multiplicative Reweighting for Robust Neural Network Optimization · Noga Bar, Tomer Koren, Raja Giryes · OOD, NoLa · 13/9/0 · 24 Feb 2021
FINE Samples for Learning with Noisy Labels · Taehyeon Kim, Jongwoo Ko, Sangwook Cho, J. Choi, Se-Young Yun · NoLa · 30/103/0 · 23 Feb 2021
Provable Super-Convergence with a Large Cyclical Learning Rate · Samet Oymak · 33/12/0 · 22 Feb 2021
Learning to Combat Noisy Labels via Classification Margins · Jason Lin, Jelena Bradic · NoLa · 34/7/0 · 01 Feb 2021
Self-Adaptive Training: Bridging Supervised and Self-Supervised Learning · Lang Huang, Chaoning Zhang, Hongyang R. Zhang · SSL · 33/24/0 · 21 Jan 2021
Phases of learning dynamics in artificial neural networks: with or without mislabeled data · Yu Feng, Y. Tu · 25/2/0 · 16 Jan 2021
Provable Generalization of SGD-trained Neural Networks of Any Width in the Presence of Adversarial Label Noise · Spencer Frei, Yuan Cao, Quanquan Gu · FedML, MLT · 64/19/0 · 04 Jan 2021
Advances in Electron Microscopy with Deep Learning · Jeffrey M. Ede · 32/2/0 · 04 Jan 2021
Identifying Training Stop Point with Noisy Labeled Data · Sree Ram Kamabattula, V. Devarajan, Babak Namazi, G. Sankaranarayanan · NoLa · 8/2/0 · 24 Dec 2020
Semi-supervised novelty detection using ensembles with regularized disagreement · A. Tifrea, E. Stavarache, Fanny Yang · UQCV · 29/6/0 · 10 Dec 2020
Robust Learning by Self-Transition for Handling Noisy Labels · Hwanjun Song, Minseok Kim, Dongmin Park, Yooju Shin, Jae-Gil Lee · NoLa · 13/40/0 · 08 Dec 2020
KNN-enhanced Deep Learning Against Noisy Labels · Shuyu Kong, You Li, Jia Wang, Amin Rezaei, H. Zhou · NoLa · 11/5/0 · 08 Dec 2020
Statistical Mechanics of Deep Linear Neural Networks: The Back-Propagating Kernel Renormalization · Qianyi Li, H. Sompolinsky · 16/69/0 · 07 Dec 2020
Coresets for Robust Training of Neural Networks against Noisy Labels · Baharan Mirzasoleiman, Kaidi Cao, J. Leskovec · NoLa · 11/32/0 · 15 Nov 2020
A Survey of Label-noise Representation Learning: Past, Present and Future · Bo Han, Quanming Yao, Tongliang Liu, Gang Niu, Ivor W. Tsang, James T. Kwok, Masashi Sugiyama · NoLa · 24/158/0 · 09 Nov 2020
Deep Transfer Learning for Automated Diagnosis of Skin Lesions from Photographs · Emma Rocheteau, Doyoon Kim · MedIm · 8/3/0 · 06 Nov 2020
On Convergence and Generalization of Dropout Training · Poorya Mianjy, R. Arora · 29/30/0 · 23 Oct 2020
Review: Deep Learning in Electron Microscopy · Jeffrey M. Ede · 34/79/0 · 17 Sep 2020
Early Stopping in Deep Networks: Double Descent and How to Eliminate it · Reinhard Heckel, Fatih Yilmaz · 26/43/0 · 20 Jul 2020
Learning from Noisy Labels with Deep Neural Networks: A Survey · Hwanjun Song, Minseok Kim, Dongmin Park, Yooju Shin, Jae-Gil Lee · NoLa · 24/960/0 · 16 Jul 2020
How benign is benign overfitting? · Amartya Sanyal, P. Dokania, Varun Kanade, Philip H. S. Torr · NoLa, AAML · 23/57/0 · 08 Jul 2020
Early-Learning Regularization Prevents Memorization of Noisy Labels · Sheng Liu, Jonathan Niles-Weed, N. Razavian, C. Fernandez-Granda · NoLa · 8/553/0 · 30 Jun 2020