Multi-Teacher Knowledge Distillation for Incremental Implicitly-Refined Classification

23 February 2022
L. Yu, Zhenyu Weng, Yuqing Wang, Yuesheng Zhu
CLL

Papers citing "Multi-Teacher Knowledge Distillation for Incremental Implicitly-Refined Classification"

4 / 4 papers shown

Dual Branch Network Towards Accurate Printed Mathematical Expression Recognition
Yuqing Wang, Zhenyu Weng, Zhaokun Zhou, Shuaijian Ji, Zhongjie Ye, Yuesheng Zhu
14 Dec 2023

Fantastic Gains and Where to Find Them: On the Existence and Prospect of General Knowledge Transfer between Any Pretrained Model
Karsten Roth, Lukas Thede, Almut Sophia Koepke, Oriol Vinyals, Olivier J. Hénaff, Zeynep Akata
AAML
26 Oct 2023

Dual-Curriculum Teacher for Domain-Inconsistent Object Detection in Autonomous Driving
L. Yu, Yifan Zhang, Lanqing Hong, Fei Chen, Zhenguo Li
17 Oct 2022

Distilling Causal Effect of Data in Class-Incremental Learning
Xinting Hu, Kaihua Tang, Chunyan Miao, Xiansheng Hua, Hanwang Zhang
CML
02 Mar 2021