Online Ensemble Model Compression using Knowledge Distillation
Devesh Walawalkar, Zhiqiang Shen, Marios Savvides
15 November 2020 · arXiv:2011.07449
Papers citing "Online Ensemble Model Compression using Knowledge Distillation" (10 of 10 papers shown)

| Title | Authors | Tags | Citations | Date |
|---|---|---|---|---|
| HDKD: Hybrid Data-Efficient Knowledge Distillation Network for Medical Image Classification | Omar S. El-Assiouti, Ghada Hamed, Dina Khattab, H. M. Ebied | | 1 | 10 Jul 2024 |
| The Road to On-board Change Detection: A Lightweight Patch-Level Change Detection Network via Exploring the Potential of Pruning and Pooling | Lihui Xue, Zhihao Wang, Xueqian Wang, Gang Li | | 1 | 16 Oct 2023 |
| Rotation Invariant Quantization for Model Compression | Dor-Joseph Kampeas, Yury Nahshan, Hanoch Kremer, Gil Lederman, Shira Zaloshinski, Zheng Li, E. Haleva | MQ | 1 | 03 Mar 2023 |
| Mutual Contrastive Learning for Visual Representation Learning | Chuanguang Yang, Zhulin An, Linhang Cai, Yongjun Xu | VLM, SSL | 75 | 26 Apr 2021 |
| MEAL V2: Boosting Vanilla ResNet-50 to 80%+ Top-1 Accuracy on ImageNet without Tricks | Zhiqiang Shen, Marios Savvides | | 63 | 17 Sep 2020 |
| Knowledge Distillation: A Survey | Jianping Gou, B. Yu, Stephen J. Maybank, Dacheng Tao | VLM | 2,857 | 09 Jun 2020 |
| CBNet: A Novel Composite Backbone Network Architecture for Object Detection | Yudong Liu, Yongtao Wang, Siwei Wang, Tingting Liang, Qijie Zhao, Zhi Tang, Haibin Ling | ObjD | 244 | 09 Sep 2019 |
| Knowledge Distillation by On-the-Fly Native Ensemble | Xu Lan, Xiatian Zhu, S. Gong | | 474 | 12 Jun 2018 |
| Large scale distributed neural network training through online distillation | Rohan Anil, Gabriel Pereyra, Alexandre Passos, Róbert Ormándi, George E. Dahl, Geoffrey E. Hinton | FedML | 404 | 09 Apr 2018 |
| Aggregated Residual Transformations for Deep Neural Networks | Saining Xie, Ross B. Girshick, Piotr Dollár, Zhuowen Tu, Kaiming He | | 10,237 | 16 Nov 2016 |