ResearchTrend.AI

Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification

Liuyu Xiang, Guiguang Ding, Jungong Han
arXiv:2001.01536 · 6 January 2020

Papers citing "Learning From Multiple Experts: Self-paced Knowledge Distillation for Long-tailed Classification"

3 / 153 papers shown

Long-tailed Recognition by Routing Diverse Distribution-Aware Experts
Xudong Wang, Long Lian, Zhongqi Miao, Ziwei Liu, Stella X. Yu
30 · 379 · 0 · 05 Oct 2020

Model-Agnostic Meta-Learning for Fast Adaptation of Deep Networks
Chelsea Finn, Pieter Abbeel, Sergey Levine
OOD · 338 · 11,684 · 0 · 09 Mar 2017

Range Loss for Deep Face Recognition with Long-tail
Xiao Zhang, Zhiyuan Fang, Yandong Wen, Zhifeng Li, Yu Qiao
CVBM · 237 · 446 · 0 · 28 Nov 2016