Interpolation Consistency Training for Semi-Supervised Learning (arXiv:1903.03825)
9 March 2019
Vikas Verma, Kenji Kawaguchi, Alex Lamb, Arno Solin, Yoshua Bengio, David Lopez-Paz

Papers citing "Interpolation Consistency Training for Semi-Supervised Learning"

22 / 322 papers shown
CMTS: Conditional Multiple Trajectory Synthesizer for Generating Safety-critical Driving Scenarios
Wenhao Ding, Mengdi Xu, Ding Zhao
27 · 52 · 0 · 17 Sep 2019

Snowball: Iterative Model Evolution and Confident Sample Discovery for Semi-Supervised Learning on Very Small Labeled Datasets
Yang Li, Jianhe Yuan, Zhiqun Zhao, Hao Sun, Zhihai He
8 · 6 · 0 · 04 Sep 2019

Semi-supervised Learning of Fetal Anatomy from Ultrasound
Jeremy Tan, Anselm Au, Qingjie Meng, Bernhard Kainz
17 · 10 · 0 · 30 Aug 2019

Pseudo-Labeling and Confirmation Bias in Deep Semi-Supervised Learning
Eric Arazo, Diego Ortego, Paul Albert, Noel E. O'Connor, Kevin McGuinness
16 · 815 · 0 · 08 Aug 2019

Sound source detection, localization and classification using consecutive ensemble of CRNN models
Slawomir Kapka, M. Lewandowski
11 · 66 · 0 · 02 Aug 2019

InfoGraph: Unsupervised and Semi-supervised Graph-Level Representation Learning via Mutual Information Maximization
Fan-Yun Sun, Jordan Hoffmann, Vikas Verma, Jian Tang
SSL
38 · 840 · 0 · 31 Jul 2019

HODGEPODGE: Sound event detection based on ensemble of semi-supervised learning methods
Ziqiang Shi, Liu Liu, Huibin Lin, Rujie Liu, Anyan Shi
14 · 20 · 0 · 17 Jul 2019

Asymptotic Bayes risk for Gaussian mixture in a semi-supervised setting
Marc Lelarge, Léo Miolane
14 · 28 · 0 · 08 Jul 2019

Exploring Self-Supervised Regularization for Supervised and Semi-Supervised Learning
Phi Vu Tran
SSL
6 · 16 · 0 · 25 Jun 2019

Energy Models for Better Pseudo-Labels: Improving Semi-Supervised Classification with the 1-Laplacian Graph Energy
Angelica I. Aviles-Rivero, Nicolas Papadakis, Ruoteng Li, P. Sellars, Samar M. Alsaleh, R. Tan, Carola-Bibiane Schönlieb
24 · 3 · 0 · 20 Jun 2019

Interpolated Adversarial Training: Achieving Robust Neural Networks without Sacrificing Too Much Accuracy
Alex Lamb, Vikas Verma, Kenji Kawaguchi, Alexander Matyasko, Savya Khosla, Arno Solin, Yoshua Bengio
AAML
30 · 98 · 0 · 16 Jun 2019

Selfie: Self-supervised Pretraining for Image Embedding
Trieu H. Trinh, Minh-Thang Luong, Quoc V. Le
SSL
11 · 111 · 0 · 07 Jun 2019

Semi-supervised semantic segmentation needs strong, varied perturbations
Geoff French, S. Laine, Timo Aila, Michal Mackiewicz, G. Finlayson
26 · 29 · 0 · 05 Jun 2019

Achieving Generalizable Robustness of Deep Neural Networks by Stability Training
Jan Laermann, Wojciech Samek, Nils Strodthoff
OOD
24 · 15 · 0 · 03 Jun 2019

Semi-Supervised Learning with Scarce Annotations
Sylvestre-Alvise Rebuffi, Sébastien Ehrhardt, Kai Han, Andrea Vedaldi, Andrew Zisserman
SSL
21 · 49 · 0 · 21 May 2019

Virtual Mixup Training for Unsupervised Domain Adaptation
Xudong Mao, Yun Ma, Zhenguo Yang, Yangbin Chen, Qing Li
30 · 52 · 0 · 10 May 2019

S4L: Self-Supervised Semi-Supervised Learning
Xiaohua Zhai, Avital Oliver, Alexander Kolesnikov, Lucas Beyer
SSL, VLM
29 · 786 · 0 · 09 May 2019

MixMatch: A Holistic Approach to Semi-Supervised Learning
David Berthelot, Nicholas Carlini, Ian Goodfellow, Nicolas Papernot, Avital Oliver, Colin Raffel
17 · 2,985 · 0 · 06 May 2019

Unsupervised Data Augmentation for Consistency Training
Qizhe Xie, Zihang Dai, Eduard H. Hovy, Minh-Thang Luong, Quoc V. Le
32 · 2,286 · 0 · 29 Apr 2019

On Adversarial Mixup Resynthesis
Christopher Beckham, S. Honari, Vikas Verma, Alex Lamb, F. Ghadiri, R. Devon Hjelm, Yoshua Bengio, C. Pal
AAML
12 · 12 · 0 · 07 Mar 2019

There Are Many Consistent Explanations of Unlabeled Data: Why You Should Average
Ben Athiwaratkun, Marc Finzi, Pavel Izmailov, A. Wilson
199 · 243 · 0 · 14 Jun 2018

Mean teachers are better role models: Weight-averaged consistency targets improve semi-supervised deep learning results
Antti Tarvainen, Harri Valpola
OOD, MoMe
261 · 1,275 · 0 · 06 Mar 2017