Contrastive Learning Improves Model Robustness Under Label Noise

19 April 2021
Aritra Ghosh, Andrew Lan
Topics: NoLa
Links: arXiv 2104.08984 (abs, PDF, HTML) · GitHub (32★)

Papers citing "Contrastive Learning Improves Model Robustness Under Label Noise" (33 of 33 papers shown)
Multi-level Supervised Contrastive Learning
  Naghmeh Ghanooni, Barbod Pajoum, Harshit Rawal, Sophie Fellenz, Vo Nguyen Le Duy, Marius Kloft · 04 Feb 2025 · 0 citations
Do We Really Need Gold Samples for Sample Weighting Under Label Noise? [NoLa]
  Aritra Ghosh, Andrew Lan · 19 Apr 2021 · 9 citations
Contrast to Divide: Self-Supervised Pre-Training for Learning with Noisy Labels [SSL]
  Evgenii Zheltonozhskii, Chaim Baskin, A. Mendelson, A. Bronstein, Or Litany · 25 Mar 2021 · 94 citations
Early-Learning Regularization Prevents Memorization of Noisy Labels [NoLa]
  Sheng Liu, Jonathan Niles-Weed, N. Razavian, C. Fernandez-Granda · 30 Jun 2020 · 569 citations
Normalized Loss Functions for Deep Learning with Noisy Labels [NoLa]
  Xingjun Ma, Hanxun Huang, Yisen Wang, Simone Romano, S. Erfani, James Bailey · 24 Jun 2020 · 445 citations
DivideMix: Learning with Noisy Labels as Semi-supervised Learning [NoLa]
  Junnan Li, R. Socher, Guosheng Lin · 18 Feb 2020 · 1,034 citations
A Simple Framework for Contrastive Learning of Visual Representations [SSL]
  Ting-Li Chen, Simon Kornblith, Mohammad Norouzi, Geoffrey E. Hinton · 13 Feb 2020 · 18,897 citations
Peer Loss Functions: Learning from Noisy Labels without Knowing Noise Rates [NoLa]
  Yang Liu, Hongyi Guo · 08 Oct 2019 · 242 citations
L_DMI: An Information-theoretic Noise-robust Loss Function [NoLa]
  Yilun Xu, Peng Cao, Yuqing Kong, Yizhou Wang · 08 Sep 2019 · 57 citations
Using Self-Supervised Learning Can Improve Model Robustness and Uncertainty [OOD, SSL]
  Dan Hendrycks, Mantas Mazeika, Saurav Kadavath, Basel Alomair · 28 Jun 2019 · 950 citations
MixMatch: A Holistic Approach to Semi-Supervised Learning
  David Berthelot, Nicholas Carlini, Ian Goodfellow, Nicolas Papernot, Avital Oliver, Colin Raffel · 06 May 2019 · 3,033 citations
Unsupervised Label Noise Modeling and Loss Correction [NoLa]
  Eric Arazo Sanchez, Diego Ortego, Paul Albert, Noel E. O'Connor, Kevin McGuinness · 25 Apr 2019 · 616 citations
Using Pre-Training Can Improve Model Robustness and Uncertainty [NoLa]
  Dan Hendrycks, Kimin Lee, Mantas Mazeika · 28 Jan 2019 · 727 citations
Learning to Learn from Noisy Labeled Data [NoLa]
  Junnan Li, Yongkang Wong, Qi Zhao, Mohan Kankanhalli · 13 Dec 2018 · 334 citations
Bilevel Programming for Hyperparameter Optimization and Meta-Learning
  Luca Franceschi, P. Frasconi, Saverio Salzo, Riccardo Grazzi, Massimiliano Pontil · 13 Jun 2018 · 732 citations
Masking: A New Perspective of Noisy Supervision [NoLa]
  Bo Han, Jiangchao Yao, Gang Niu, Mingyuan Zhou, Ivor Tsang, Ya Zhang, Masashi Sugiyama · 21 May 2018 · 255 citations
Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels [NoLa]
  Zhilu Zhang, M. Sabuncu · 20 May 2018 · 2,615 citations
Co-teaching: Robust Training of Deep Neural Networks with Extremely Noisy Labels [NoLa]
  Bo Han, Quanming Yao, Xingrui Yu, Gang Niu, Miao Xu, Weihua Hu, Ivor Tsang, Masashi Sugiyama · 18 Apr 2018 · 2,082 citations
Joint Optimization Framework for Learning with Noisy Labels [NoLa]
  Daiki Tanaka, Daiki Ikami, T. Yamasaki, Kiyoharu Aizawa · 30 Mar 2018 · 712 citations
Learning to Reweight Examples for Robust Deep Learning [OOD, NoLa]
  Mengye Ren, Wenyuan Zeng, Binh Yang, R. Urtasun · 24 Mar 2018 · 1,431 citations
Robust Loss Functions under Label Noise for Deep Neural Networks [NoLa, OOD]
  Aritra Ghosh, Himanshu Kumar, P. Sastry · 27 Dec 2017 · 959 citations
MentorNet: Learning Data-Driven Curriculum for Very Deep Neural Networks on Corrupted Labels [NoLa]
  Lu Jiang, Zhengyuan Zhou, Thomas Leung, Li Li, Li Fei-Fei · 14 Dec 2017 · 1,456 citations
mixup: Beyond Empirical Risk Minimization [NoLa]
  Hongyi Zhang, Moustapha Cissé, Yann N. Dauphin, David Lopez-Paz · 25 Oct 2017 · 9,811 citations
Large Batch Training of Convolutional Networks [ODL]
  Yang You, Igor Gitman, Boris Ginsburg · 13 Aug 2017 · 852 citations
A Downsampled Variant of ImageNet as an Alternative to the CIFAR datasets [SSeg, OOD]
  P. Chrabaszcz, I. Loshchilov, Frank Hutter · 27 Jul 2017 · 649 citations
Learning From Noisy Large-Scale Datasets With Minimal Supervision
  Andreas Veit, N. Alldrin, Gal Chechik, Ivan Krasin, Abhinav Gupta, Serge J. Belongie · 06 Jan 2017 · 480 citations
Making Deep Neural Networks Robust to Label Noise: a Loss Correction Approach [NoLa]
  Giorgio Patrini, A. Rozza, A. Menon, Richard Nock, Zhuang Li · 13 Sep 2016 · 1,459 citations
Learning with Symmetric Label Noise: The Importance of Being Unhinged [NoLa]
  Brendan van Rooyen, A. Menon, Robert C. Williamson · 28 May 2015 · 313 citations
Webly Supervised Learning of Convolutional Networks [SSL]
  Xinlei Chen, Abhinav Gupta · 07 May 2015 · 373 citations
Training Deep Neural Networks on Noisy Labels with Bootstrapping [NoLa]
  Scott E. Reed, Honglak Lee, Dragomir Anguelov, Christian Szegedy, D. Erhan, Andrew Rabinovich · 20 Dec 2014 · 1,023 citations
Making Risk Minimization Tolerant to Label Noise [NoLa]
  Aritra Ghosh, Naresh Manwani, P. Sastry · 14 Mar 2014 · 215 citations
Double Ramp Loss Based Reject Option Classifier
  Naresh Manwani, Aritra Ghosh, P. Sastry, Ramasubramanian Sundararajan · 26 Nov 2013 · 49 citations
Identifying Mislabeled Training Data
  C. Brodley, M. Friedl · 01 Jun 2011 · 972 citations