FitNets: Hints for Thin Deep Nets
arXiv 1412.6550 · 19 December 2014
Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio
FedML

Papers citing "FitNets: Hints for Thin Deep Nets"
Showing 50 of 747 citing papers.

Teach me how to Interpolate a Myriad of Embeddings
Shashanka Venkataramanan, Ewa Kijak, Laurent Amsaleg, Yannis Avrithis
29 Jun 2022

Cut Inner Layers: A Structured Pruning Strategy for Efficient U-Net GANs
Bo-Kyeong Kim, Shinkook Choi, Hancheol Park
29 Jun 2022

Knowledge Distillation of Transformer-based Language Models Revisited
Chengqiang Lu, Jianwei Zhang, Yunfei Chu, Zhengyu Chen, Jingren Zhou, Fei Wu, Haiqing Chen, Hongxia Yang
VLM · 29 Jun 2022

Revisiting Architecture-aware Knowledge Distillation: Smaller Models and Faster Search
Taehyeon Kim, Heesoo Myeong, Se-Young Yun
27 Jun 2022

MetaFed: Federated Learning among Federations with Cyclic Knowledge Distillation for Personalized Healthcare
Yiqiang Chen, Wang Lu, Xin Qin, Jindong Wang, Xing Xie
FedML · 17 Jun 2022

Multi-scale Feature Extraction and Fusion for Online Knowledge Distillation
Panpan Zou, Yinglei Teng, Tao Niu
16 Jun 2022

Confidence-aware Self-Semantic Distillation on Knowledge Graph Embedding
Yichen Liu, C. Wang, Defang Chen, Zhehui Zhou, Yan Feng, Chun-Yen Chen
07 Jun 2022

Evaluation-oriented Knowledge Distillation for Deep Face Recognition
Yanhua Huang, Jiaxiang Wu, Xingkun Xu, Shouhong Ding
CVBM · 06 Jun 2022

Point-to-Voxel Knowledge Distillation for LiDAR Semantic Segmentation
Yuenan Hou, Xinge Zhu, Yuexin Ma, Chen Change Loy, Yikang Li
3DPC · 05 Jun 2022

Rethinking the Augmentation Module in Contrastive Learning: Learning Hierarchical Augmentation Invariance with Expanded Views
Junbo Zhang, Kaisheng Ma
01 Jun 2022

Towards Efficient 3D Object Detection with Knowledge Distillation
Jihan Yang, Shaoshuai Shi, Runyu Ding, Zhe Wang, Xiaojuan Qi
30 May 2022

Parameter-Efficient and Student-Friendly Knowledge Distillation
Jun Rao, Xv Meng, Liang Ding, Shuhan Qi, Dacheng Tao
28 May 2022

A Closer Look at Self-Supervised Lightweight Vision Transformers
Shaoru Wang, Jin Gao, Zeming Li, Jian Sun, Weiming Hu
ViT · 28 May 2022

Fast Object Placement Assessment
Li Niu, Qingyang Liu, Zhenchen Liu, Jiangtong Li
28 May 2022

Embedding Principle in Depth for the Loss Landscape Analysis of Deep Neural Networks
Zhiwei Bai, Yaoyu Zhang, Z. Xu, Tao Luo
26 May 2022

IDEAL: Query-Efficient Data-Free Learning from Black-box Models
Jie M. Zhang, Chen Chen, Lingjuan Lyu
23 May 2022

PointDistiller: Structured Knowledge Distillation Towards Efficient and Compact 3D Detection
Linfeng Zhang, Runpei Dong, Hung-Shuo Tai, Kaisheng Ma
3DPC · 23 May 2022

Knowledge Distillation via the Target-aware Transformer
Sihao Lin, Hongwei Xie, Bing Wang, Kaicheng Yu, Xiaojun Chang, Xiaodan Liang, G. Wang
ViT · 22 May 2022

Knowledge Distillation from A Stronger Teacher
Tao Huang, Shan You, Fei Wang, Chao Qian, Chang Xu
21 May 2022

[Re] Distilling Knowledge via Knowledge Review
Apoorva Verma, Pranjal Gulati, Sarthak Gupta
VLM · 18 May 2022

Knowledge Distillation Meets Open-Set Semi-Supervised Learning
Jing Yang, Xiatian Zhu, Adrian Bulat, Brais Martínez, Georgios Tzimiropoulos
13 May 2022

Contrastive Supervised Distillation for Continual Representation Learning
Tommaso Barletti, Niccolò Biondi, F. Pernici, Matteo Bruni, A. Bimbo
CLL · 11 May 2022

Generalized Knowledge Distillation via Relationship Matching
Han-Jia Ye, Su Lu, De-Chuan Zhan
FedML · 04 May 2022

Masked Generative Distillation
Zhendong Yang, Zhe Li, Mingqi Shao, Dachuan Shi, Zehuan Yuan, Chun Yuan
FedML · 03 May 2022

Multiple Degradation and Reconstruction Network for Single Image Denoising via Knowledge Distillation
Juncheng Li, Hanhui Yang, Qiaosi Yi, Faming Fang, Guangwei Gao, T. Zeng, Guixu Zhang
29 Apr 2022

A Closer Look at Branch Classifiers of Multi-exit Architectures
Shaohui Lin, Bo Ji, Rongrong Ji, Angela Yao
28 Apr 2022

HRPose: Real-Time High-Resolution 6D Pose Estimation Network Using Knowledge Distillation
Qingze Guan, Zihao Sheng, Shibei Xue
3DH · 20 Apr 2022

Empirical Evaluation and Theoretical Analysis for Representation Learning: A Survey
Kento Nozawa, Issei Sato
AI4TS · 18 Apr 2022

MoEBERT: from BERT to Mixture-of-Experts via Importance-Guided Adaptation
Simiao Zuo, Qingru Zhang, Chen Liang, Pengcheng He, T. Zhao, Weizhu Chen
MoE · 15 Apr 2022

2D Human Pose Estimation: A Survey
Haoming Chen, Runyang Feng, Sifan Wu, Hao Xu, F. Zhou, Zhenguang Liu
3DH · 15 Apr 2022

Cross-Image Relational Knowledge Distillation for Semantic Segmentation
Chuanguang Yang, Helong Zhou, Zhulin An, Xue Jiang, Yong Xu, Qian Zhang
14 Apr 2022

Spatial Likelihood Voting with Self-Knowledge Distillation for Weakly Supervised Object Detection
Ze Chen, Zhihang Fu, Jianqiang Huang, Mingyuan Tao, Rongxin Jiang, Xiang Tian, Yao-wu Chen, Xiansheng Hua
WSOD · 14 Apr 2022

Localization Distillation for Object Detection
Zhaohui Zheng, Rongguang Ye, Ping Wang, Dongwei Ren, Jun Wang, W. Zuo, Ming-Ming Cheng
12 Apr 2022

CoupleFace: Relation Matters for Face Recognition Distillation
Jiaheng Liu, Haoyu Qin, Yichao Wu, Jinyang Guo, Ding Liang, Ke Xu
CVBM · 12 Apr 2022

Robust Cross-Modal Representation Learning with Progressive Self-Distillation
A. Andonian, Shixing Chen, Raffay Hamid
VLM · 10 Apr 2022

CD²-pFed: Cyclic Distillation-guided Channel Decoupling for Model Personalization in Federated Learning
Yiqing Shen, Yuyin Zhou, Lequan Yu
OOD · 08 Apr 2022

Universal Representations: A Unified Look at Multiple Task and Domain Learning
Wei-Hong Li, Xialei Liu, Hakan Bilen
SSL · OOD · 06 Apr 2022

Non-Local Latent Relation Distillation for Self-Adaptive 3D Human Pose Estimation
Jogendra Nath Kundu, Siddharth Seth, Anirudh Gururaj Jamkhandi, Pradyumna, Varun Jampani, Anirban Chakraborty, R. Venkatesh Babu
3DH · 05 Apr 2022

Class-Incremental Learning by Knowledge Distillation with Adaptive Feature Consolidation
Minsoo Kang, Jaeyoo Park, Bohyung Han
CLL · 02 Apr 2022

R2L: Distilling Neural Radiance Field to Neural Light Field for Efficient Novel View Synthesis
Huan Wang, Jian Ren, Zeng Huang, Kyle Olszewski, Menglei Chai, Yun Fu, Sergey Tulyakov
31 Mar 2022

Image-to-Lidar Self-Supervised Distillation for Autonomous Driving Data
Corentin Sautier, Gilles Puy, Spyros Gidaris, Alexandre Boulch, Andrei Bursuc, Renaud Marlet
3DPC · 30 Mar 2022

Self-Distillation from the Last Mini-Batch for Consistency Regularization
Yiqing Shen, Liwu Xu, Yuzhe Yang, Yaqian Li, Yandong Guo
30 Mar 2022

LiDAR Distillation: Bridging the Beam-Induced Domain Gap for 3D Object Detection
Yi Wei, Zibu Wei, Yongming Rao, Jiaxin Li, Jie Zhou, Jiwen Lu
28 Mar 2022

Knowledge Distillation with the Reused Teacher Classifier
Defang Chen, Jianhan Mei, Hailin Zhang, C. Wang, Yan Feng, Chun-Yen Chen
26 Mar 2022

A Cross-Domain Approach for Continuous Impression Recognition from Dyadic Audio-Visual-Physio Signals
Yuanchao Li, Catherine Lai
25 Mar 2022

Class-Incremental Learning for Action Recognition in Videos
Jaeyoo Park, Minsoo Kang, Bohyung Han
CLL · 25 Mar 2022

R-DFCIL: Relation-Guided Representation Learning for Data-Free Class Incremental Learning
Qiankun Gao, Chen Zhao, Guohao Li, Jian Zhang
CLL · 24 Mar 2022

Channel Self-Supervision for Online Knowledge Distillation
Shixi Fan, Xuan Cheng, Xiaomin Wang, Chun Yang, Pan Deng, Minghui Liu, Jiali Deng, Meilin Liu
22 Mar 2022

SSD-KD: A Self-supervised Diverse Knowledge Distillation Method for Lightweight Skin Lesion Classification Using Dermoscopic Images
Yongwei Wang, Yuheng Wang, Tim K. Lee, Chunyan Miao, Z. J. Wang
22 Mar 2022

Open-Vocabulary One-Stage Detection with Hierarchical Visual-Language Knowledge Distillation
Zongyang Ma, Guan Luo, Jin Gao, Liang Li, Yuxin Chen, Shaoru Wang, Congxuan Zhang, Weiming Hu
VLM · ObjD · 20 Mar 2022