FitNets: Hints for Thin Deep Nets
Adriana Romero, Nicolas Ballas, Samira Ebrahimi Kahou, Antoine Chassang, C. Gatta, Yoshua Bengio
19 December 2014 · arXiv:1412.6550
Papers citing "FitNets: Hints for Thin Deep Nets" (50 of 748 papers shown)
Arch-Net: Model Distillation for Architecture Agnostic Model Deployment
Weixin Xu
Zipeng Feng
Shuangkang Fang
Song Yuan
Yi Yang
Shuchang Zhou
MQ
30
1
0
01 Nov 2021
Distilling Object Detectors with Feature Richness
Zhixing Du
Rui Zhang
Ming-Fang Chang
Xishan Zhang
Shaoli Liu
Tianshi Chen
Yunji Chen
ObjD
24
74
0
01 Nov 2021
Learning Distilled Collaboration Graph for Multi-Agent Perception
Yiming Li
Shunli Ren
Pengxiang Wu
Siheng Chen
Chen Feng
Wenjun Zhang
32
239
0
01 Nov 2021
Revisiting Discriminator in GAN Compression: A Generator-discriminator Cooperative Compression Scheme
Shaojie Li
Jie Wu
Xuefeng Xiao
Rongrong Ji
Xudong Mao
28
35
0
27 Oct 2021
Reconstructing Pruned Filters using Cheap Spatial Transformations
Roy Miles
K. Mikolajczyk
31
0
0
25 Oct 2021
Instance-Conditional Knowledge Distillation for Object Detection
Zijian Kang
Peizhen Zhang
Xinming Zhang
Jian Sun
N. Zheng
27
76
0
25 Oct 2021
MUSE: Feature Self-Distillation with Mutual Information and Self-Information
Yunpeng Gong
Ye Yu
Gaurav Mittal
Greg Mori
Mei Chen
SSL
32
2
0
25 Oct 2021
Pixel-by-Pixel Cross-Domain Alignment for Few-Shot Semantic Segmentation
A. Tavera
Fabio Cermelli
Carlo Masone
Barbara Caputo
29
19
0
22 Oct 2021
Augmenting Knowledge Distillation With Peer-To-Peer Mutual Learning For Model Compression
Usma Niyaz
Deepti R. Bathula
26
8
0
21 Oct 2021
Class-Discriminative CNN Compression
Yuchen Liu
D. Wentzlaff
S. Kung
26
1
0
21 Oct 2021
Adaptive Distillation: Aggregating Knowledge from Multiple Paths for Efficient Distillation
Sumanth Chennupati
Mohammad Mahdi Kamani
Zhongwei Cheng
Lin Chen
35
4
0
19 Oct 2021
Sub-bit Neural Networks: Learning to Compress and Accelerate Binary Neural Networks
Yikai Wang
Yi Yang
Gang Hua
Anbang Yao
MQ
29
15
0
18 Oct 2021
Object DGCNN: 3D Object Detection using Dynamic Graphs
Yue Wang
Justin Solomon
3DPC
157
104
0
13 Oct 2021
Towards Mixed-Precision Quantization of Neural Networks via Constrained Optimization
Weihan Chen
Peisong Wang
Jian Cheng
MQ
49
62
0
13 Oct 2021
Towards Streaming Egocentric Action Anticipation
Antonino Furnari
G. Farinella
EgoV
33
6
0
11 Oct 2021
KNOT: Knowledge Distillation using Optimal Transport for Solving NLP Tasks
Rishabh Bhardwaj
Tushar Vaidya
Soujanya Poria
OT
FedML
65
7
0
06 Oct 2021
Multilingual AMR Parsing with Noisy Knowledge Distillation
Deng Cai
Xin Li
Jackie Chun-Sing Ho
Lidong Bing
W. Lam
29
18
0
30 Sep 2021
Towards Efficient Post-training Quantization of Pre-trained Language Models
Haoli Bai
Lu Hou
Lifeng Shang
Xin Jiang
Irwin King
M. Lyu
MQ
82
47
0
30 Sep 2021
Prune Your Model Before Distill It
Jinhyuk Park
Albert No
VLM
54
27
0
30 Sep 2021
Deep Structured Instance Graph for Distilling Object Detectors
Yixin Chen
Pengguang Chen
Shu Liu
Liwei Wang
Jiaya Jia
ObjD
ISeg
23
12
0
27 Sep 2021
Partial to Whole Knowledge Distillation: Progressive Distilling Decomposed Knowledge Boosts Student Better
Xuanyang Zhang
Xinming Zhang
Jian Sun
30
1
0
26 Sep 2021
Weakly-Supervised Monocular Depth Estimation with Resolution-Mismatched Data
Jialei Xu
Yuanchao Bai
Xianming Liu
Junjun Jiang
Xiangyang Ji
MDE
46
5
0
23 Sep 2021
LGD: Label-guided Self-distillation for Object Detection
Peizhen Zhang
Zijian Kang
Tong Yang
Xinming Zhang
N. Zheng
Jian Sun
ObjD
106
30
0
23 Sep 2021
Dynamic Knowledge Distillation for Pre-trained Language Models
Lei Li
Yankai Lin
Shuhuai Ren
Peng Li
Jie Zhou
Xu Sun
30
49
0
23 Sep 2021
A Studious Approach to Semi-Supervised Learning
Sahil Khose
Shruti Jain
V. Manushree
23
0
0
18 Sep 2021
New Perspective on Progressive GANs Distillation for One-class Novelty Detection
Zhiwei Zhang
Yu Dong
Hanyu Peng
Shifeng Chen
29
0
0
15 Sep 2021
On the Efficiency of Subclass Knowledge Distillation in Classification Tasks
A. Sajedi
Konstantinos N. Plataniotis
16
4
0
12 Sep 2021
Facial Anatomical Landmark Detection using Regularized Transfer Learning with Application to Fetal Alcohol Syndrome Recognition
Zeyu Fu
Jianbo Jiao
M. Suttie
J. A. Noble
CVBM
22
9
0
12 Sep 2021
Dual Correction Strategy for Ranking Distillation in Top-N Recommender System
Youngjune Lee
Kee-Eung Kim
22
19
0
08 Sep 2021
Knowledge Distillation Using Hierarchical Self-Supervision Augmented Distribution
Chuanguang Yang
Zhulin An
Linhang Cai
Yongjun Xu
37
15
0
07 Sep 2021
Dual Transfer Learning for Event-based End-task Prediction via Pluggable Event to Image Translation
Lin Wang
Yujeong Chae
Kuk-Jin Yoon
32
32
0
04 Sep 2021
Adversarial Robustness for Unsupervised Domain Adaptation
Muhammad Awais
Fengwei Zhou
Hang Xu
Lanqing Hong
Ping Luo
Sung-Ho Bae
Zhenguo Li
28
39
0
02 Sep 2021
Full-Cycle Energy Consumption Benchmark for Low-Carbon Computer Vision
Bo Li
Xinyang Jiang
Donglin Bai
Yuge Zhang
Ningxin Zheng
Xuanyi Dong
Lu Liu
Yuqing Yang
Dongsheng Li
14
10
0
30 Aug 2021
CoCo DistillNet: a Cross-layer Correlation Distillation Network for Pathological Gastric Cancer Segmentation
Wenxuan Zou
Muyi Sun
40
9
0
27 Aug 2021
Efficient Medical Image Segmentation Based on Knowledge Distillation
Dian Qin
Jiajun Bu
Zhe Liu
Xin Shen
Sheng Zhou
Jingjun Gu
Zhihong Wang
Lei Wu
Hui-Fen Dai
30
129
0
23 Aug 2021
LIGA-Stereo: Learning LiDAR Geometry Aware Representations for Stereo-based 3D Detector
Xiaoyang Guo
Shaoshuai Shi
Xiaogang Wang
Hongsheng Li
3DPC
36
106
0
18 Aug 2021
Joint Multiple Intent Detection and Slot Filling via Self-distillation
Lisong Chen
Peilin Zhou
Yuexian Zou
VLM
26
31
0
18 Aug 2021
G-DetKD: Towards General Distillation Framework for Object Detectors via Contrastive and Semantic-guided Feature Imitation
Lewei Yao
Renjie Pi
Hang Xu
Wei Zhang
Zhenguo Li
Tong Zhang
37
34
0
17 Aug 2021
Enhancing Self-supervised Video Representation Learning via Multi-level Feature Optimization
Rui Qian
Yuxi Li
Huabin Liu
John See
Shuangrui Ding
Xian Liu
Dian Li
Weiyao Lin
35
42
0
04 Aug 2021
Online Knowledge Distillation for Efficient Pose Estimation
Zheng Li
Jingwen Ye
Xiuming Zhang
Ying Huang
Zhigeng Pan
26
94
0
04 Aug 2021
Hierarchical Self-supervised Augmented Knowledge Distillation
Chuanguang Yang
Zhulin An
Linhang Cai
Yongjun Xu
SSL
35
76
0
29 Jul 2021
MFAGAN: A Compression Framework for Memory-Efficient On-Device Super-Resolution GAN
Wenlong Cheng
Mingbo Zhao
Zhiling Ye
Shuhang Gu
24
22
0
27 Jul 2021
ReSSL: Relational Self-Supervised Learning with Weak Augmentation
Mingkai Zheng
Shan You
Fei Wang
Chao Qian
Changshui Zhang
Xiaogang Wang
Chang Xu
23
113
0
20 Jul 2021
Double Similarity Distillation for Semantic Image Segmentation
Yingchao Feng
Xian Sun
Wenhui Diao
Jihao Li
Xin Gao
24
62
0
19 Jul 2021
Unpaired cross-modality educed distillation (CMEDL) for medical image segmentation
Jue Jiang
A. Rimner
Joseph O. Deasy
Harini Veeraraghavan
19
20
0
16 Jul 2021
Trustworthy AI: A Computational Perspective
Haochen Liu
Yiqi Wang
Wenqi Fan
Xiaorui Liu
Yaxin Li
Shaili Jain
Yunhao Liu
Anil K. Jain
Jiliang Tang
FaML
104
197
0
12 Jul 2021
Noise Stability Regularization for Improving BERT Fine-tuning
Hang Hua
Xingjian Li
Dejing Dou
Chengzhong Xu
Jiebo Luo
19
44
0
10 Jul 2021
Novel Visual Category Discovery with Dual Ranking Statistics and Mutual Knowledge Distillation
Bingchen Zhao
Kai Han
26
107
0
07 Jul 2021
Deep Learning for Micro-expression Recognition: A Survey
Yante Li
Jinsheng Wei
Yang Liu
Janne Kauttonen
Guoying Zhao
43
61
0
06 Jul 2021
A Light-weight Deep Human Activity Recognition Algorithm Using Multi-knowledge Distillation
Runze Chen
Haiyong Luo
Fang Zhao
Xuechun Meng
Zhiqing Xie
Yida Zhu
VLM
HAI
29
2
0
06 Jul 2021