FitNets: Hints for Thin Deep Nets
arXiv:1412.6550 · 19 December 2014
Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio
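For context on what the citing papers below build on: FitNets introduces hint-based knowledge distillation, in which a thin, deep student is first trained so that a middle ("guided") layer matches a middle ("hint") layer of a wider teacher through a small learned regressor, and is then distilled on the teacher's soft outputs. A minimal PyTorch-style sketch of the stage-one hint loss follows; the module, variable names, and channel widths are invented for illustration, not taken from the paper's code.

```python
# Hedged sketch of FitNets-style hint training (stage one).
import torch
import torch.nn as nn

class HintLoss(nn.Module):
    """Regress the student's guided-layer features onto the teacher's
    hint-layer features via a learned 1x1-conv regressor."""

    def __init__(self, student_channels: int, teacher_channels: int):
        super().__init__()
        # The regressor lets a thin student match the wider teacher's width.
        self.regressor = nn.Conv2d(student_channels, teacher_channels, kernel_size=1)

    def forward(self, student_feat, teacher_feat):
        # L_HT = 1/2 * || u_h(x) - r(v_g(x)) ||^2, averaged over elements
        return 0.5 * (teacher_feat - self.regressor(student_feat)).pow(2).mean()

# Illustrative usage with random activations (all shapes are assumptions):
hint = HintLoss(student_channels=32, teacher_channels=128)
s = torch.randn(8, 32, 16, 16)   # thin student's mid-network features
t = torch.randn(8, 128, 16, 16)  # wider teacher's hint-layer features
loss = hint(s, t)                # minimized to warm-start the student's lower layers
```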
Papers citing "FitNets: Hints for Thin Deep Nets" (50 of 747 shown)
X-Learner: Learning Cross Sources and Tasks for Universal Visual Representation
Yinan He, Gengshi Huang, Siyu Chen, Jianing Teng, Wang Kun, Zhen-fei Yin, Lu Sheng, Ziwei Liu, Yu Qiao, Jing Shao · VLM, SSL, ViT · 16 Mar 2022

Representation Compensation Networks for Continual Semantic Segmentation
Chang-Bin Zhang, Jianqiang Xiao, Xialei Liu, Ying-Cong Chen, Ming-Ming Cheng · SSeg, CLL · 10 Mar 2022

Knowledge Distillation as Efficient Pre-training: Faster Convergence, Higher Data-efficiency, and Better Transferability
Ruifei He, Shuyang Sun, Jihan Yang, Song Bai, Xiaojuan Qi · 10 Mar 2022

How many Observations are Enough? Knowledge Distillation for Trajectory Forecasting
Alessio Monti, Angelo Porrello, Simone Calderara, Pasquale Coscia, Lamberto Ballan, Rita Cucchiara · 09 Mar 2022

MSDN: Mutually Semantic Distillation Network for Zero-Shot Learning
Shiming Chen, Ziming Hong, Guosen Xie, Wenhan Wang, Qinmu Peng, Kai Wang, Jian-jun Zhao, Xinge You · VLM · 07 Mar 2022

Extracting Effective Subnetworks with Gumbel-Softmax
Robin Dupont, M. Alaoui, H. Sahbi, A. Lebois · 25 Feb 2022

Learn From the Past: Experience Ensemble Knowledge Distillation
Chaofei Wang, Shaowei Zhang, S. Song, Gao Huang · 25 Feb 2022

Efficient Video Segmentation Models with Per-frame Inference
Yifan Liu, Chunhua Shen, Changqian Yu, Jingdong Wang · 24 Feb 2022

HRel: Filter Pruning based on High Relevance between Activation Maps and Class Labels
C. Sarvani, Mrinmoy Ghorai, S. Dubey, S. H. Shabbeer Basha · VLM · 22 Feb 2022

Meta Knowledge Distillation
Jihao Liu, Boxiao Liu, Hongsheng Li, Yu Liu · 16 Feb 2022

Exploring Inter-Channel Correlation for Diversity-preserved Knowledge Distillation
Li Liu, Qingle Huang, Sihao Lin, Hongwei Xie, Bing Wang, Xiaojun Chang, Xiao-Xue Liang · 08 Feb 2022

Learning Representation from Neural Fisher Kernel with Low-rank Approximation
Ruixiang Zhang, Shuangfei Zhai, Etai Littwin, J. Susskind · SSL · 04 Feb 2022

Auto-Transfer: Learning to Route Transferrable Representations
K. Murugesan, Vijay Sadashivaiah, Ronny Luss, Karthikeyan Shanmugam, Pin-Yu Chen, Amit Dhurandhar · AAML · 02 Feb 2022

Deconfounded Representation Similarity for Comparison of Neural Networks
Tianyu Cui, Yogesh Kumar, Pekka Marttinen, Samuel Kaski · CML · 31 Jan 2022

AutoDistil: Few-shot Task-agnostic Neural Architecture Search for Distilling Large Language Models
Dongkuan Xu, Subhabrata Mukherjee, Xiaodong Liu, Debadeepta Dey, Wenhui Wang, Xiang Zhang, Ahmed Hassan Awadallah, Jianfeng Gao · 29 Jan 2022

Dynamic Rectification Knowledge Distillation
Fahad Rahman Amik, Ahnaf Ismat Tasin, Silvia Ahmed, M. M. L. Elahi, Nabeel Mohammed · 27 Jan 2022

Adaptive Instance Distillation for Object Detection in Autonomous Driving
Qizhen Lan, Qing Tian · 26 Jan 2022

Enabling Deep Learning on Edge Devices through Filter Pruning and Knowledge Transfer
Kaiqi Zhao, Yitao Chen, Ming Zhao · 22 Jan 2022

It's All in the Head: Representation Knowledge Distillation through Classifier Sharing
Emanuel Ben-Baruch, M. Karklinsky, Yossi Biton, Avi Ben-Cohen, Hussam Lawen, Nadav Zamir · 18 Jan 2022

STURE: Spatial-Temporal Mutual Representation Learning for Robust Data Association in Online Multi-Object Tracking
Haidong Wang, Zhiyong Li, Yaping Li, Ke Nai, Ming Wen · VOT · 18 Jan 2022

Cross-modal Contrastive Distillation for Instructional Activity Anticipation
Zhengyuan Yang, Jingen Liu, Jing-ling Huang, Xiaodong He, Tao Mei, Chenliang Xu, Jiebo Luo · 18 Jan 2022

Egeria: Efficient DNN Training with Knowledge-Guided Layer Freezing
Yiding Wang, D. Sun, Kai Chen, Fan Lai, Mosharaf Chowdhury · 17 Jan 2022

SimReg: Regression as a Simple Yet Effective Tool for Self-supervised Knowledge Distillation
K. Navaneet, Soroush Abbasi Koohpayegani, Ajinkya Tejankar, Hamed Pirsiavash · 13 Jan 2022

Data-Free Knowledge Transfer: A Survey
Yuang Liu, Wei Zhang, Jun Wang, Jianyong Wang · 31 Dec 2021

A Multi-channel Training Method Boost the Performance
Yingdong Hu · 27 Dec 2021

Multi-Modality Distillation via Learning the teacher's modality-level Gram Matrix
Peng Liu · 21 Dec 2021

Incremental Cross-view Mutual Distillation for Self-supervised Medical CT Synthesis
Chaowei Fang, Liang Wang, Dingwen Zhang, Jun Xu, Yixuan Yuan, Junwei Han · OOD · 20 Dec 2021

A Deep Knowledge Distillation framework for EEG assisted enhancement of single-lead ECG based sleep staging
Vaibhav Joshi, S. Vijayarangan, S. Preejith, M. Sivaprakasam · 14 Dec 2021

Knowledge Distillation for Object Detection via Rank Mimicking and Prediction-guided Feature Imitation
Gang Li, Xiang Li, Yujie Wang, Shanshan Zhang, Yichao Wu, Ding Liang · ObjD · 09 Dec 2021

ADD: Frequency Attention and Multi-View based Knowledge Distillation to Detect Low-Quality Compressed Deepfake Images
B. Le, Simon S. Woo · AAML · 07 Dec 2021

Toward Practical Monocular Indoor Depth Estimation
Cho-Ying Wu, Jialiang Wang, Michael Hall, Ulrich Neumann, Shuochen Su · 3DV, MDE · 04 Dec 2021

The Augmented Image Prior: Distilling 1000 Classes by Extrapolating from a Single Image
Yuki M. Asano, Aaqib Saeed · 01 Dec 2021

Information Theoretic Representation Distillation
Roy Miles, Adrian Lopez-Rodriguez, K. Mikolajczyk · MQ · 01 Dec 2021

Mixed Precision Low-bit Quantization of Neural Network Language Models for Speech Recognition
Junhao Xu, Jianwei Yu, Shoukang Hu, Xunying Liu, Helen Meng · MQ · 29 Nov 2021

Improved Knowledge Distillation via Adversarial Collaboration
Zhiqiang Liu, Chengkai Huang, Yanxia Liu · 29 Nov 2021

ESGN: Efficient Stereo Geometry Network for Fast 3D Object Detection
Aqi Gao, Yanwei Pang, Jing Nie, Jiale Cao, Yishun Guo · 3DPC · 28 Nov 2021

Self-slimmed Vision Transformer
Zhuofan Zong, Kunchang Li, Guanglu Song, Yali Wang, Yu Qiao, B. Leng, Yu Liu · ViT · 24 Nov 2021

EvDistill: Asynchronous Events to End-task Learning via Bidirectional Reconstruction-guided Cross-modal Knowledge Distillation
Lin Wang, Yujeong Chae, Sung-Hoon Yoon, Tae-Kyun Kim, Kuk-Jin Yoon · 24 Nov 2021

Local-Selective Feature Distillation for Single Image Super-Resolution
Seonguk Park, Nojun Kwak · 22 Nov 2021

Hierarchical Knowledge Distillation for Dialogue Sequence Labeling
Shota Orihashi, Yoshihiro Yamazaki, Naoki Makishima, Mana Ihori, Akihiko Takashima, Tomohiro Tanaka, Ryo Masumura · 22 Nov 2021

Teacher-Student Training and Triplet Loss to Reduce the Effect of Drastic Face Occlusion
Mariana-Iuliana Georgescu, Georgian-Emilian Duta, Radu Tudor Ionescu · 3DH, CVBM · 20 Nov 2021

Robust and Accurate Object Detection via Self-Knowledge Distillation
Weipeng Xu, Pengzhi Chu, Renhao Xie, Xiongziyan Xiao, Hongcheng Huang · AAML, ObjD · 14 Nov 2021

Learning Data Teaching Strategies Via Knowledge Tracing
Ghodai M. Abdelrahman, Qing Wang · 13 Nov 2021

Facial Landmark Points Detection Using Knowledge Distillation-Based Neural Networks
A. P. Fard, Mohammad H. Mahoor · CVBM · 13 Nov 2021

Meta-Teacher For Face Anti-Spoofing
Yunxiao Qin, Zitong Yu, Longbin Yan, Zezheng Wang, Chenxu Zhao, Zhen Lei · CVBM · 12 Nov 2021

MixACM: Mixup-Based Robustness Transfer via Distillation of Activated Channel Maps
Muhammad Awais, Fengwei Zhou, Chuanlong Xie, Jiawei Li, Sung-Ho Bae, Zhenguo Li · AAML · 09 Nov 2021

Cold Brew: Distilling Graph Node Representations with Incomplete or Missing Neighborhoods
Wenqing Zheng, Edward W. Huang, Nikhil S. Rao, S. Katariya, Zhangyang Wang, Karthik Subbian · 08 Nov 2021

Oracle Teacher: Leveraging Target Information for Better Knowledge Distillation of CTC Models
J. Yoon, H. Kim, Hyeon Seung Lee, Sunghwan Ahn, N. Kim · 05 Nov 2021

Multi-Glimpse Network: A Robust and Efficient Classification Architecture based on Recurrent Downsampled Attention
S. Tan, Runpei Dong, Kaisheng Ma · 03 Nov 2021

Arch-Net: Model Distillation for Architecture Agnostic Model Deployment
Weixin Xu, Zipeng Feng, Shuangkang Fang, Song Yuan, Yi Yang, Shuchang Zhou · MQ · 01 Nov 2021