Cross-Layer Distillation with Semantic Calibration (arXiv:2012.03236)
6 December 2020
Defang Chen, Jian-Ping Mei, Yuan Zhang, Can Wang, Yan Feng, Chun-Yen Chen
Papers citing "Cross-Layer Distillation with Semantic Calibration" (18 of 118 papers shown)
Attention-based Cross-Layer Domain Alignment for Unsupervised Domain Adaptation. Xu Ma, Junkun Yuan, Yen-wei Chen, Ruofeng Tong, Lanfen Lin. 27 Feb 2022.
Knowledge Distillation with Deep Supervision. Shiya Luo, Defang Chen, Can Wang. 16 Feb 2022.
KENN: Enhancing Deep Neural Networks by Leveraging Knowledge for Time Series Forecasting. M. A. Chattha, L. V. Elst, M. I. Malik, Andreas Dengel, Sheraz Ahmed. 08 Feb 2022.
Exploring Inter-Channel Correlation for Diversity-preserved Knowledge Distillation. Li Liu, Qingle Huang, Sihao Lin, Hongwei Xie, Bing Wang, Xiaojun Chang, Xiao-Xue Liang. 08 Feb 2022.
Confidence-Aware Multi-Teacher Knowledge Distillation. Hailin Zhang, Defang Chen, Can Wang. 30 Dec 2021.
Improved Knowledge Distillation via Adversarial Collaboration. Zhiqiang Liu, Chengkai Huang, Yanxia Liu. 29 Nov 2021.
Semi-Online Knowledge Distillation. Zhiqiang Liu, Yanxia Liu, Chengkai Huang. 23 Nov 2021.
Edge-Cloud Polarization and Collaboration: A Comprehensive Survey for AI. Jiangchao Yao, Shengyu Zhang, Yang Yao, Feng Wang, Jianxin Ma, ..., Kun Kuang, Chao-Xiang Wu, Fei Wu, Jingren Zhou, Hongxia Yang. 11 Nov 2021.
On Cross-Layer Alignment for Model Fusion of Heterogeneous Neural Networks. Dang Nguyen, T. Nguyen, Khai Nguyen, D.Q. Phung, Hung Bui, Nhat Ho. 29 Oct 2021.
FedHe: Heterogeneous Models and Communication-Efficient Federated Learning. Chan Yun Hin, Edith C. H. Ngai. 19 Oct 2021.
Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution. Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu. 07 Sep 2021.
Multi-granularity for knowledge distillation. Baitan Shao, Ying Chen. 15 Aug 2021.
Distilling Holistic Knowledge with Graph Neural Networks. Sheng Zhou, Yucheng Wang, Defang Chen, Jiawei Chen, Xin Eric Wang, Can Wang, Jiajun Bu. 12 Aug 2021.
DisCo: Remedy Self-supervised Learning on Lightweight Models with Distilled Contrastive Learning. Yuting Gao, Jia-Xin Zhuang, Xiaowei Guo, Hao Cheng, Xing Sun, Ke Li, Feiyue Huang. 19 Apr 2021.
Distilling and Transferring Knowledge via cGAN-generated Samples for Image Classification and Regression. Xin Ding, Z. J. Wang, Zuheng Xu, Z. Jane Wang, William J. Welch. 07 Apr 2021.
Distilling a Powerful Student Model via Online Knowledge Distillation. Shaojie Li, Mingbao Lin, Yan Wang, Yongjian Wu, Yonghong Tian, Ling Shao, Rongrong Ji. 26 Mar 2021.
Knowledge Distillation: A Survey. Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao. 09 Jun 2020.
Large scale distributed neural network training through online distillation. Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton. 09 Apr 2018.