Unifying Heterogeneous Classifiers with Distillation
arXiv:1904.06062 · 12 April 2019
J. Vongkulbhisal, Phongtharin Vinayavekhin, M. V. Scarzanella

Papers citing "Unifying Heterogeneous Classifiers with Distillation" (11 papers)

Federated One-Shot Learning with Data Privacy and Objective-Hiding
Maximilian Egger, Rüdiger Urbanke, Rawad Bitar · FedML · 29 Apr 2025

Swiss Army Knife: Synergizing Biases in Knowledge from Vision Foundation Models for Multi-Task Learning
Yuxiang Lu, Shengcao Cao, Yu-xiong Wang · 18 Oct 2024

Swing Distillation: A Privacy-Preserving Knowledge Distillation Framework
Junzhuo Li, Xinwei Wu, Weilong Dong, Shuangzhi Wu, Chao Bian, Deyi Xiong · 16 Dec 2022

Collaborative Multi-Teacher Knowledge Distillation for Learning Low Bit-width Deep Neural Networks
Cuong Pham, Tuan Hoang, Thanh-Toan Do · FedML, MQ · 27 Oct 2022

Distillation of Human-Object Interaction Contexts for Action Recognition
Muna Almushyti, Frederick W. Li · 17 Dec 2021

Cross-lingual Transfer for Text Classification with Dictionary-based Heterogeneous Graph
Nuttapong Chairatanakul, Noppayut Sriwatanasakdi, Nontawat Charoenphakdee, Xin Liu, T. Murata · 09 Sep 2021

Pool of Experts: Realtime Querying Specialized Knowledge in Massive Neural Networks
Hakbin Kim, Dong-Wan Choi · 03 Jul 2021

Adaptive Multi-Teacher Multi-level Knowledge Distillation
Yuang Liu, Wei Zhang, Jun Wang · 06 Mar 2021

Group Knowledge Transfer: Federated Learning of Large CNNs at the Edge
Chaoyang He, M. Annavaram, A. Avestimehr · FedML · 28 Jul 2020

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao · VLM · 09 Jun 2020

Unifying Specialist Image Embedding into Universal Image Embedding
Yang Feng, Futang Peng, Xu-Yao Zhang, Wei-wei Zhu, Shanfeng Zhang, Howard Zhou, Zhen Li, Tom Duerig, Shih-Fu Chang, Jiebo Luo · SSL · 08 Mar 2020