FitNets: Hints for Thin Deep Nets
arXiv:1412.6550 · 19 December 2014
Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio
Papers citing "FitNets: Hints for Thin Deep Nets" (50 of 747 shown)
CNN-based RGB-D Salient Object Detection: Learn, Select and Fuse
  Hao Chen, Youfu Li · ObjD · 24/23/0 · 20 Sep 2019

Hint-Based Training for Non-Autoregressive Machine Translation
  Zhuohan Li, Zi Lin, Di He, Fei Tian, Tao Qin, Liwei Wang, Tie-Yan Liu · 31/72/0 · 15 Sep 2019

Deep Elastic Networks with Model Selection for Multi-Task Learning
  Chanho Ahn, Eunwoo Kim, Songhwai Oh · 49/49/0 · 11 Sep 2019

Knowledge Transfer Graph for Deep Collaborative Learning
  Soma Minami, Tsubasa Hirakawa, Takayoshi Yamashita, H. Fujiyoshi · 30/9/0 · 10 Sep 2019

Extreme Low Resolution Activity Recognition with Confident Spatial-Temporal Attention Transfer
  Yucai Bai, Qinglong Zou, Xieyuanli Chen, Lingxi Li, Zhengming Ding, Long Chen · 20/3/0 · 09 Sep 2019

Knowledge Distillation for End-to-End Person Search
  Bharti Munjal, Fabio Galasso, S. Amin · FedML · 48/15/0 · 03 Sep 2019

Patient Knowledge Distillation for BERT Model Compression
  S. Sun, Yu Cheng, Zhe Gan, Jingjing Liu · 78/832/0 · 25 Aug 2019

Progressive Face Super-Resolution via Attention to Facial Landmark
  Deok-Hun Kim, Minseon Kim, Gihyun Kwon, Daeshik Kim · SupR, CVBM · 19/135/0 · 22 Aug 2019

MobileFAN: Transferring Deep Hidden Representation for Face Alignment
  Yang Zhao, Yifan Liu, Chunhua Shen, Yongsheng Gao, Shengwu Xiong · CVBM · 27/39/0 · 11 Aug 2019

Effective Training of Convolutional Neural Networks with Low-bitwidth Weights and Activations
  Bohan Zhuang, Jing Liu, Mingkui Tan, Lingqiao Liu, Ian Reid, Chunhua Shen · MQ · 29/45/0 · 10 Aug 2019

Self-Knowledge Distillation in Natural Language Processing
  Sangchul Hahn, Heeyoul Choi · 11/111/0 · 02 Aug 2019

Distilling Knowledge From a Deep Pose Regressor Network
  Muhamad Risqi U. Saputra, Pedro Porto Buarque de Gusmão, Yasin Almalioglu, Andrew Markham, A. Trigoni · 11/102/0 · 02 Aug 2019

Distilled Siamese Networks for Visual Tracking
  Jianbing Shen, Yuanpei Liu, Xingping Dong, Xiankai Lu, Fahad Shahbaz Khan, Guosheng Lin · 20/101/0 · 24 Jul 2019

Similarity-Preserving Knowledge Distillation
  Frederick Tung, Greg Mori · 45/961/0 · 23 Jul 2019

Privileged Features Distillation at Taobao Recommendations
  Chen Xu, Quan Li, Junfeng Ge, Jinyang Gao, Xiaoyong Yang, Changhua Pei, Fei Sun, Jian Wu, Hanxiao Sun, Wenwu Ou · 15/67/0 · 11 Jul 2019

Distill-2MD-MTL: Data Distillation based on Multi-Dataset Multi-Domain Multi-Task Frame Work to Solve Face Related Tasks, Multi Task Learning, Semi-Supervised Learning
  Sepidehsadat Hosseini, M. Shabani, N. Cho · CVBM · 36/3/0 · 08 Jul 2019

GAN-Knowledge Distillation for one-stage Object Detection
  Wanwei Wang, Jin ke Yu Fan Zong · ObjD · 22/28/0 · 20 Jun 2019

Membership Privacy for Machine Learning Models Through Knowledge Transfer
  Virat Shejwalkar, Amir Houmansadr · 22/10/0 · 15 Jun 2019

Divide and Conquer: Leveraging Intermediate Feature Representations for Quantized Training of Neural Networks
  Ahmed T. Elthakeb, Prannoy Pilligundla, Alex Cloninger, H. Esmaeilzadeh · MQ · 26/8/0 · 14 Jun 2019

Distilling Object Detectors with Fine-grained Feature Imitation
  Tao Wang, Li-xin Yuan, Xiaopeng Zhang, Jiashi Feng · ObjD · 13/378/0 · 09 Jun 2019

DiCENet: Dimension-wise Convolutions for Efficient Networks
  Sachin Mehta, Hannaneh Hajishirzi, Mohammad Rastegari · 36/43/0 · 08 Jun 2019

Efficient Object Embedding for Spliced Image Retrieval
  Bor-Chun Chen, Zuxuan Wu, L. Davis, Ser-Nam Lim · 32/8/0 · 28 May 2019

Zero-shot Knowledge Transfer via Adversarial Belief Matching
  P. Micaelli, Amos Storkey · 19/228/0 · 23 May 2019

Lightweight Network Architecture for Real-Time Action Recognition
  Alexander Kozlov, Vadim Andronov, Y. Gritsenko · ViT · 25/33/0 · 21 May 2019

Be Your Own Teacher: Improve the Performance of Convolutional Neural Networks via Self Distillation
  Linfeng Zhang, Jiebo Song, Anni Gao, Jingwei Chen, Chenglong Bao, Kaisheng Ma · FedML · 27/846/0 · 17 May 2019

Dynamic Neural Network Channel Execution for Efficient Training
  Simeon E. Spasov, Pietro Lio · 19/4/0 · 15 May 2019

Learning What and Where to Transfer
  Yunhun Jang, Hankook Lee, Sung Ju Hwang, Jinwoo Shin · 22/149/0 · 15 May 2019

Object Detection in 20 Years: A Survey
  Zhengxia Zou, Keyan Chen, Zhenwei Shi, Yuhong Guo, Jieping Ye · VLM, ObjD, AI4TS · 32/2,290/0 · 13 May 2019

High Frequency Residual Learning for Multi-Scale Image Classification
  Bowen Cheng, Rong Xiao, Jianfeng Wang, Thomas Huang, Lei Zhang · 34/21/0 · 07 May 2019

Similarity of Neural Network Representations Revisited
  Simon Kornblith, Mohammad Norouzi, Honglak Lee, Geoffrey E. Hinton · 82/1,362/0 · 01 May 2019

TextKD-GAN: Text Generation using Knowledge Distillation and Generative Adversarial Networks
  Md. Akmal Haidar, Mehdi Rezagholizadeh · 37/52/0 · 23 Apr 2019

Student Becoming the Master: Knowledge Amalgamation for Joint Scene Parsing, Depth Estimation, and More
  Jingwen Ye, Yixin Ji, Xinchao Wang, Kairi Ou, Dapeng Tao, Xiuming Zhang · MoMe · 24/75/0 · 23 Apr 2019

Salient Object Detection in the Deep Learning Era: An In-Depth Survey
  Wenguan Wang, Qiuxia Lai, Huazhu Fu, Jianbing Shen, Haibin Ling, Ruigang Yang · 43/610/0 · 19 Apr 2019

Feature Fusion for Online Mutual Knowledge Distillation
  Jangho Kim, Minsung Hyun, Inseop Chung, Nojun Kwak · FedML · 28/91/0 · 19 Apr 2019

End-to-End Speech Translation with Knowledge Distillation
  Yuchen Liu, Hao Xiong, Zhongjun He, Jiajun Zhang, Hua Wu, Haifeng Wang, Chengqing Zong · 32/151/0 · 17 Apr 2019

Biphasic Learning of GANs for High-Resolution Image-to-Image Translation
  Jie Cao, Huaibo Huang, Yi Li, Jingtuo Liu, Ran He, Zhenan Sun · GAN · 26/4/0 · 14 Apr 2019

Variational Information Distillation for Knowledge Transfer
  Sungsoo Ahn, S. Hu, Andreas C. Damianou, Neil D. Lawrence, Zhenwen Dai · 58/610/0 · 11 Apr 2019

Relational Knowledge Distillation
  Wonpyo Park, Dongju Kim, Yan Lu, Minsu Cho · 16/1,387/0 · 10 Apr 2019

Spatiotemporal Knowledge Distillation for Efficient Estimation of Aerial Video Saliency
  Jia Li, K. Fu, Shengwei Zhao, Shiming Ge · 38/26/0 · 10 Apr 2019

Knowledge Distillation For Recurrent Neural Network Language Modeling With Trust Regularization
  Yangyang Shi, M. Hwang, X. Lei, Haoyu Sheng · 34/25/0 · 08 Apr 2019

Semantic-Aware Knowledge Preservation for Zero-Shot Sketch-Based Image Retrieval
  Qing Liu, Lingxi Xie, Huiyu Wang, Alan Yuille · VLM · 27/108/0 · 05 Apr 2019

Branched Multi-Task Networks: Deciding What Layers To Share
  Simon Vandenhende, Stamatios Georgoulis, Bert De Brabandere, Luc Van Gool · 25/145/0 · 05 Apr 2019

Correlation Congruence for Knowledge Distillation
  Baoyun Peng, Xiao Jin, Jiaheng Liu, Shunfeng Zhou, Yichao Wu, Yu Liu, Dongsheng Li, Zhaoning Zhang · 63/507/0 · 03 Apr 2019

Training Quantized Neural Networks with a Full-precision Auxiliary Module
  Bohan Zhuang, Lingqiao Liu, Mingkui Tan, Chunhua Shen, Ian Reid · MQ · 32/62/0 · 27 Mar 2019

Towards Optimal Structured CNN Pruning via Generative Adversarial Learning
  Shaohui Lin, Rongrong Ji, Chenqian Yan, Baochang Zhang, Liujuan Cao, QiXiang Ye, Feiyue Huang, David Doermann · CVBM · 22/505/0 · 22 Mar 2019

Rectified Decision Trees: Towards Interpretability, Compression and Empirical Soundness
  Jiawang Bai, Yiming Li, Jiawei Li, Yong Jiang, Shutao Xia · 37/15/0 · 14 Mar 2019

Structured Knowledge Distillation for Dense Prediction
  Yifan Liu, Chris Liu, Jingdong Wang, Zhenbo Luo · 27/576/0 · 11 Mar 2019

A Learnable ScatterNet: Locally Invariant Convolutional Layers
  Fergal Cotter, N. Kingsbury · 23/22/0 · 07 Mar 2019

Copying Machine Learning Classifiers
  Irene Unceta, Jordi Nin, O. Pujol · 14/18/0 · 05 Mar 2019

Efficient Video Classification Using Fewer Frames
  S. Bhardwaj, Mukundhan Srinivasan, Mitesh M. Khapra · 42/88/0 · 27 Feb 2019