arXiv: 1708.04106

Rocket Launching: A Universal and Efficient Framework for Training Well-performing Light Net
14 August 2017
Guorui Zhou, Ying Fan, Runpeng Cui, Weijie Bian, Xiaoqiang Zhu, Kun Gai

Papers citing "Rocket Launching: A Universal and Efficient Framework for Training Well-performing Light Net" (18 of 18 papers shown)

Teacher-Student Architecture for Knowledge Distillation: A Survey
Chengming Hu, Xuan Li, Danyang Liu, Haolun Wu, Xi Chen, Ju Wang, Xue Liu
08 Aug 2023

Knowledge Distillation from Single to Multi Labels: an Empirical Study
Youcai Zhang, Yuzhuo Qin, Heng-Ye Liu, Yanhao Zhang, Yaqian Li, X. Gu
15 Mar 2023

Audio Representation Learning by Distilling Video as Privileged Information
Amirhossein Hajavi, Ali Etemad
06 Feb 2023

Supervision Complexity and its Role in Knowledge Distillation
Hrayr Harutyunyan, A. S. Rawat, A. Menon, Seungyeon Kim, Sanjiv Kumar
28 Jan 2023

Directed Acyclic Graph Factorization Machines for CTR Prediction via Knowledge Distillation
Zhen Tian, Ting Bai, Ziyan Zhang, Zhiyuan Xu, Kangyi Lin, Ji-Rong Wen, Wayne Xin Zhao
21 Nov 2022

Designing and Training of Lightweight Neural Networks on Edge Devices using Early Halting in Knowledge Distillation
Rahul Mishra, Hari Prabhat Gupta
30 Sep 2022

Multi-scale Feature Extraction and Fusion for Online Knowledge Distillation
Panpan Zou, Yinglei Teng, Tao Niu
16 Jun 2022

Knowledge Distillation Meets Open-Set Semi-Supervised Learning
Jing Yang, Xiatian Zhu, Adrian Bulat, Brais Martínez, Georgios Tzimiropoulos
13 May 2022

2D Human Pose Estimation: A Survey
Haoming Chen, Runyang Feng, Sifan Wu, Hao Xu, F. Zhou, Zhenguang Liu
15 Apr 2022

Learning Efficient Vision Transformers via Fine-Grained Manifold Distillation
Zhiwei Hao, Jianyuan Guo, Ding Jia, Kai Han, Yehui Tang, Chao Zhang, Dacheng Tao, Yunhe Wang
03 Jul 2021

Teacher's pet: understanding and mitigating biases in distillation
Michal Lukasik, Srinadh Bhojanapalli, A. Menon, Sanjiv Kumar
19 Jun 2021

Privileged Graph Distillation for Cold Start Recommendation
Shuai Wang, Kun Zhang, Le Wu, Haiping Ma, Richang Hong, Meng Wang
31 May 2021

DCAF: A Dynamic Computation Allocation Framework for Online Serving System
Biye Jiang, Pengye Zhang, Rihan Chen, Binding Dai, Xinchen Luo, Yifan Yang, Guan Wang, Guorui Zhou, Xiaoqiang Zhu, Kun Gai
17 Jun 2020

Knowledge Distillation: A Survey
Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao
09 Jun 2020

Knowledge distillation via adaptive instance normalization
Jing Yang, Brais Martínez, Adrian Bulat, Georgios Tzimiropoulos
09 Mar 2020

Privileged Features Distillation at Taobao Recommendations
Chen Xu, Quan Li, Junfeng Ge, Jinyang Gao, Xiaoyong Yang, Changhua Pei, Fei Sun, Jian Wu, Hanxiao Sun, Wenwu Ou
11 Jul 2019

Low-resolution Face Recognition in the Wild via Selective Knowledge Distillation
Shiming Ge, Shengwei Zhao, Chenyu Li, Jia Li
25 Nov 2018

Deep Interest Evolution Network for Click-Through Rate Prediction
Guorui Zhou, Na Mou, Ying Fan, Qi Pi, Weijie Bian, Chang Zhou, Xiaoqiang Zhu, Kun Gai
11 Sep 2018