MAML is a Noisy Contrastive Learner in Classification
29 June 2021
Chia-Hsiang Kao, Wei-Chen Chiu, Pin-Yu Chen
arXiv: 2106.15367

Papers citing "MAML is a Noisy Contrastive Learner in Classification"

10 / 10 papers shown

Learn To Learn More Precisely
Runxi Cheng, Yongxian Wei, Xianglong He, Wanyun Zhu, Songsong Huang, Fei Richard Yu, Fei Ma, Chun Yuan
08 Aug 2024

Meta-GPS++: Enhancing Graph Meta-Learning with Contrastive Learning and Self-Training
Y. Liu, Mengyu Li, Ximing Li, Lan Huang, Fausto Giunchiglia, Yanchun Liang, Xiaoyue Feng, Renchu Guan
Communities: SSL
20 Jul 2024
"Define Your Terms" : Enhancing Efficient Offensive Speech
  Classification with Definition
"Define Your Terms" : Enhancing Efficient Offensive Speech Classification with Definition
H. Nghiem
Umang Gupta
Fred Morstatter
05 Feb 2024

Plug-and-Play Feature Generation for Few-Shot Medical Image Classification
Qianyu Guo, Huifang Du, Xing Jia, Shuyong Gao, Yan Teng, Haofen Wang, Wenqiang Zhang
Communities: MedIm, VLM
14 Oct 2023

BatMan-CLR: Making Few-shots Meta-Learners Resilient Against Label Noise
Jeroen Galjaard, Robert Birke, Juan F. Pérez, Lydia Y. Chen
Communities: NoLa
12 Sep 2023

Unleash Model Potential: Bootstrapped Meta Self-supervised Learning
Wenwen Qiang, Changwen Zheng, Jingyao Wang
Communities: SSL
28 Aug 2023

Provable Multi-Task Representation Learning by Two-Layer ReLU Neural Networks
Liam Collins, Hamed Hassani, Mahdi Soltanolkotabi, Aryan Mokhtari, Sanjay Shakkottai
13 Jul 2023

Learning an Explicit Hyperparameter Prediction Function Conditioned on Tasks
Jun Shu, Deyu Meng, Zongben Xu
06 Jul 2021

Rapid Learning or Feature Reuse? Towards Understanding the Effectiveness of MAML
Aniruddh Raghu, M. Raghu, Samy Bengio, Oriol Vinyals
19 Sep 2019

Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
Chelsea Finn, Pieter Abbeel, Sergey Levine
Communities: OOD
09 Mar 2017